NEUROCOMPUTING | Volume: 71
Mixtures of robust probabilistic principal component analyzers
Article; Proceedings Paper
Archambeau, Cedric [1]; Delannay, Nicolas [2]; Verleysen, Michel [2]
[1] UCL, Ctr Computat Stat & Machine Learning, London WC1E 6BT, England
[2] Catholic Univ Louvain, Machine Learning Grp, B-1348 Louvain, Belgium
Keywords: mixture model; principal component analysis; dimensionality reduction; robustness to outliers; non-Gaussianity; EM algorithm
DOI: 10.1016/j.neucom.2007.11.029
Source: Elsevier
【 Abstract 】
Mixtures of probabilistic principal component analyzers model high-dimensional nonlinear data by combining local linear models. Each mixture component is specifically designed to extract the local principal orientations in the data. An important issue with this generative model is its sensitivity to data lying off the low-dimensional manifold. In order to address this problem, the mixtures of robust probabilistic principal component analyzers are introduced. They handle atypical points by means of a heavy-tailed distribution, the Student-t. It is shown that the resulting mixture model is an extension of the mixture of Gaussians, suitable for both robust clustering and dimensionality reduction. Finally, we briefly discuss how to construct a robust version of the closely related mixture of factor analyzers. (c) 2008 Elsevier B.V. All rights reserved.
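The robustness mechanism the abstract refers to can be illustrated outside the full mixture model. In EM for a Student-t distribution, each data point receives a latent precision weight u_n = (nu + d) / (nu + delta_n^2), where delta_n^2 is the point's squared Mahalanobis distance; outliers with large delta_n^2 get small weights and therefore contribute little to the M-step parameter updates. The sketch below is only an illustration of this standard E-step weight, not the paper's complete algorithm; the function name and the chosen nu = 3 are assumptions for demonstration.

```python
import numpy as np

def student_t_weights(X, mu, cov, nu):
    """E-step weight per point: posterior mean of the latent scale,
    u_n = (nu + d) / (nu + delta_n^2), with delta_n^2 the squared
    Mahalanobis distance of point n from mu under cov."""
    d = X.shape[1]
    diff = X - mu
    delta2 = np.sum(diff @ np.linalg.inv(cov) * diff, axis=1)
    return (nu + d) / (nu + delta2)

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(100, 2))
X[0] = [15.0, 15.0]  # one gross outlier lying off the manifold

w = student_t_weights(X, X.mean(axis=0), np.cov(X.T), nu=3.0)
# The outlier's weight is far below the typical inlier weight, so a
# weighted mean/covariance update would effectively ignore it.
print(w[0] < w[1:].mean())
```

In a Gaussian model every point implicitly has weight 1, which is why a single point far off the manifold can drag the estimated principal orientations; the Student-t's heavy tails replace that fixed weight with the data-dependent one above.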
【 License 】
Free
【 Preview 】
Files | Size | Format | View
---|---|---|---
10_1016_j_neucom_2007_11_029.pdf | 644KB | PDF | download