Conference Paper Details
30th International Conference on Machine Learning
Approximation properties of DBNs with binary hidden units and real-valued visible units
Oswin Krause Oswin.Krause@diku.dk ; Institut für Neuroinformatik ; Ruhr-Universität Bochum ; 44780 Bochum ; Germany
PID  :  118331
Source: CEUR
【 Abstract 】
Deep belief networks (DBNs) can approximate any distribution over fixed-length binary vectors. However, DBNs are frequently applied to model real-valued data, and so far little is known about their representational power in this case. We analyze the approximation properties of DBNs with two layers of binary hidden units and visible units with conditional distributions from the exponential family. It is shown that these DBNs can, under mild assumptions, model any additive mixture of distributions from the exponential family with independent variables. An arbitrarily good approximation in terms of Kullback-Leibler divergence of an m-dimensional mixture distribution with n components can be achieved by a DBN with m visible variables and n and n+1 hidden variables in the first and second hidden layer, respectively. Furthermore, relevant infinite mixtures can be approximated arbitrarily well by a DBN with a finite number of neurons. This includes the important special case of an infinite mixture of Gaussian distributions with fixed variance restricted to a compact domain, which in turn can approximate any strictly positive density over this domain.
Proceedings of the 30th International Conference on Machine Learning, Atlanta, Georgia, USA, 2013.
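The main approximation result summarized above can be written out explicitly (a sketch based only on the abstract; the precise assumptions on the exponential families and the construction are given in the paper itself). Let the target distribution over the visible vector $\mathbf{v} = (v_1, \dots, v_m)$ be an additive mixture with independent variables,
\[
p(\mathbf{v}) \;=\; \sum_{i=1}^{n} w_i \prod_{j=1}^{m} p_{ij}(v_j), \qquad w_i \ge 0, \quad \sum_{i=1}^{n} w_i = 1,
\]
where each component $p_{ij}$ belongs to an exponential family. Then, for every $\varepsilon > 0$, there exists a DBN with $m$ visible units, $n$ binary units in the first hidden layer, and $n+1$ binary units in the second hidden layer whose marginal distribution $q$ over the visible units satisfies
\[
\mathrm{KL}\!\left(p \,\|\, q\right) \;<\; \varepsilon .
\]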
Attachments
File: Approximation properties of DBNs with binary hidden units and real-valued visible units
Size: 339KB    Format: PDF    View: download
Document Metrics
Downloads: 57    Views: 36