| NEUROCOMPUTING | Volume: 388 |
| Riemann-Theta Boltzmann machine | |
| Article | |
| Krefl, Daniel [1]; Carrazza, Stefano [1]; Haghighat, Babak [2]; Kahlen, Jens [3] | |
| [1] CERN, Theoretical Physics Department, CH-1211 Geneva 23, Switzerland | |
| [2] Tsinghua University, Yau Mathematical Sciences Center, Beijing 100084, People's Republic of China | |
| [3] Dachsbau 8, D-53757 Sankt Augustin, Germany | |
| Keywords: Boltzmann machines; Neural networks; Riemann-Theta function; Density estimation; Data classification | |
| DOI: 10.1016/j.neucom.2020.01.011 | |
| Source: Elsevier | |
【 Abstract 】
A general Boltzmann machine with continuous visible and discrete integer valued hidden states is introduced. Under mild assumptions about the connection matrices, the probability density function of the visible units can be solved for analytically, yielding a novel parametric density function involving a ratio of Riemann-Theta functions. The conditional expectation of a hidden state for given visible states can also be calculated analytically, yielding a derivative of the logarithmic Riemann-Theta function. The conditional expectation can be used as activation function in a feedforward neural network, thereby increasing the modelling capacity of the network. Both the Boltzmann machine and the derived feedforward neural network can be successfully trained via standard gradient- and non-gradient-based optimization techniques. (C) 2020 Elsevier B.V. All rights reserved.
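The abstract does not reproduce the explicit formulas, but the central object can be illustrated with a minimal numerical sketch: a brute-force, truncated evaluation of the Riemann-Theta function θ(z | Ω) = Σ_{n ∈ Z^g} exp(2πi(½ nᵀΩn + nᵀz)), together with a finite-difference gradient of its logarithm, which has the shape of the activation function mentioned above (the conditional expectation of the hidden states given the visible ones). The function names, the box truncation, and the toy parameters below are illustrative assumptions, not the evaluation scheme used in the paper.

```python
import itertools
import numpy as np

def riemann_theta(z, omega, cutoff=5):
    """Truncated Riemann-Theta sum: theta(z | Omega) = sum over n in Z^g of
    exp(2*pi*i*(0.5*n.Omega.n + n.z)).  The infinite lattice sum is cut off to
    the box [-cutoff, cutoff]^g; convergence requires Im(Omega) positive definite."""
    g = len(z)
    total = 0.0 + 0.0j
    for idx in itertools.product(range(-cutoff, cutoff + 1), repeat=g):
        n = np.asarray(idx, dtype=float)
        total += np.exp(2j * np.pi * (0.5 * n @ omega @ n + n @ z))
    return total

def grad_log_theta(z, omega, eps=1e-6):
    """Finite-difference gradient of log theta(z | Omega) with respect to z.
    This is the kind of quantity the abstract describes as the activation
    function (conditional expectation of hidden states given visible states)."""
    grad = np.zeros(len(z), dtype=complex)
    for k in range(len(z)):
        step = np.zeros(len(z), dtype=complex)
        step[k] = eps
        grad[k] = (np.log(riemann_theta(z + step, omega))
                   - np.log(riemann_theta(z - step, omega))) / (2 * eps)
    return grad

# Toy example with a 2-dimensional hidden sector.  Omega is chosen purely
# imaginary with positive-definite imaginary part so the lattice sum converges.
omega = 1j * np.array([[2.0, 0.3], [0.3, 1.5]])
z = np.array([0.10 + 0.20j, -0.05 + 0.10j])
print("theta(z | Omega)      =", riemann_theta(z, omega))
print("grad log theta(z | O) =", grad_log_theta(z, omega))
```

The brute-force box truncation is used here only for readability; practical evaluations of the Riemann-Theta function typically restrict the sum to an ellipsoid of lattice points determined by a target precision.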
【 License 】
Free
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| 10_1016_j_neucom_2020_01_011.pdf | 1295 KB | PDF | |