Entropy

An Assessment of Hermite Function Based Approximations of Mutual Information Applied to Independent Component Analysis

Keywords: ICA; nonparametric estimation; Hermite functions; kernel density estimation
DOI: 10.3390/e10040745
Source: MDPI
【 Abstract 】
At the heart of many ICA techniques is a nonparametric estimate of an information measure, usually obtained via nonparametric density estimation, for example kernel density estimation. While not as popular as kernel density estimators, orthogonal functions can also be used for nonparametric density estimation, via a truncated series expansion whose coefficients are calculated from the observed data. Although such estimators do not necessarily yield a valid density, as kernel density estimators do, they are faster to calculate, in particular for a modified version of Rényi's entropy of order 2. In this paper, we compare the performance of ICA using Hermite series based estimates of Shannon's and Rényi's mutual information to that of Gaussian kernel based estimates. The comparisons also include ICA using the RADICAL estimate of Shannon's entropy and a FastICA estimate of negentropy.
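To illustrate the idea sketched in the abstract, the following is a minimal, illustrative sketch (not the authors' implementation) of a truncated Hermite-series density estimate: the sample coefficients are the empirical means of the orthonormal Hermite functions, and, by orthonormality, the integral of the squared estimate collapses to the sum of squared coefficients, which is what makes the Rényi order-2 entropy estimate fast. Function names, the truncation level `K`, and the three-term recurrence used here are standard choices, not details taken from the paper.

```python
import numpy as np

def hermite_functions(x, K):
    """Orthonormal Hermite functions h_0..h_K at points x, via the
    stable recurrence h_{k+1} = sqrt(2/(k+1)) x h_k - sqrt(k/(k+1)) h_{k-1}."""
    x = np.asarray(x, dtype=float)
    H = np.empty((K + 1, x.size))
    H[0] = np.pi ** -0.25 * np.exp(-x ** 2 / 2)
    if K >= 1:
        H[1] = np.sqrt(2.0) * x * H[0]
    for k in range(1, K):
        H[k + 1] = (np.sqrt(2.0 / (k + 1)) * x * H[k]
                    - np.sqrt(k / (k + 1)) * H[k - 1])
    return H

def hermite_density_estimate(sample, grid, K=10):
    """Truncated series estimate f_hat(x) = sum_k c_k h_k(x) with
    c_k = mean(h_k(X_i)); may be slightly negative in places."""
    coeffs = hermite_functions(sample, K).mean(axis=1)
    return coeffs @ hermite_functions(grid, K)

def renyi2_entropy_estimate(sample, K=10):
    """Renyi order-2 entropy -log integral(f_hat^2); by orthonormality
    the integral is just sum(c_k^2), so no numerical quadrature is needed."""
    coeffs = hermite_functions(sample, K).mean(axis=1)
    return -np.log(np.sum(coeffs ** 2))
```

The shortcut in `renyi2_entropy_estimate` is the point of contrast with kernel estimators: a Gaussian-kernel estimate of the same quantity requires a pairwise sum over the sample, whereas here the cost is linear in the sample size once the coefficients are computed.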
【 License 】
CC BY
© 2008 by the authors; licensee Molecular Diversity Preservation International, Basel, Switzerland.
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| RO202003190057919ZK.pdf | 220KB | | download |