Journal Article Details
NEUROCOMPUTING, Volume 169
Asynchronous gossip principal components analysis
Article; Proceedings Paper
Fellus, Jerome [1]; Picard, David [1]; Gosselin, Philippe-Henri [1]
[1] ETIS, UMR CNRS 8051, ENSEA, Univ Cergy-Pontoise, F-95014 Cergy, France
Keywords: Distributed machine learning; Dimensionality reduction; Gossip protocols
DOI: 10.1016/j.neucom.2014.11.076
Source: Elsevier
【 Abstract 】

This paper deals with Principal Components Analysis (PCA) of data spread over a network where central coordination and synchronous communication between networking nodes are forbidden. We propose an asynchronous and decentralized PCA algorithm dedicated to large scale problems, where large simultaneously applies to dimensionality, number of observations and network size. It is based on the integration of a dimension reduction step into a gossip consensus protocol. Unlike other approaches, a straightforward dual formulation makes it suitable when observed dimensions are distributed. We theoretically show its equivalence with a centralized PCA under a low-rank assumption on training data. An experimental analysis reveals that it achieves a good accuracy with a reasonable communication cost even when the low-rank assumption is relaxed. (C) 2015 Elsevier B.V. All rights reserved.
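The abstract describes combining a gossip consensus protocol with a local dimension-reduction step so that every node ends up with the principal components of the full dataset. A minimal illustrative sketch of the underlying idea is to gossip-average the nodes' local covariance matrices and then extract the leading eigenvectors locally; this omits the per-step dimension reduction and the dual formulation that are the paper's actual contributions, and all names, sizes, and the noise level below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: low-rank data (plus small noise) split across 4 nodes.
d, k, n_nodes, n_per = 20, 3, 4, 50
basis = rng.standard_normal((d, k))
data = [basis @ rng.standard_normal((k, n_per))
        + 0.1 * rng.standard_normal((d, n_per))
        for _ in range(n_nodes)]

# Each node holds only its local (uncentered) covariance estimate.
cov = [X @ X.T / n_per for X in data]

# Randomized pairwise gossip: two random nodes replace their estimates
# with the pairwise average; the sum over nodes is preserved, so all
# estimates converge to the network-wide average covariance.
for _ in range(200):
    i, j = rng.choice(n_nodes, size=2, replace=False)
    avg = (cov[i] + cov[j]) / 2
    cov[i], cov[j] = avg, avg.copy()

# Centralized reference: covariance of the pooled data.
global_cov = sum(X @ X.T for X in data) / (n_nodes * n_per)

# Compare the top-k eigenspace at node 0 with the centralized one
# (eigh returns eigenvalues in ascending order, so take the last k).
_, v_true = np.linalg.eigh(global_cov)
_, v0 = np.linalg.eigh(cov[0])
P_true = v_true[:, -k:] @ v_true[:, -k:].T
P0 = v0[:, -k:] @ v0[:, -k:].T
subspace_err = np.linalg.norm(P_true - P0)
```

After enough gossip rounds, `subspace_err` is negligible: node 0's local eigendecomposition matches the centralized PCA, which is the equivalence the paper establishes (under a low-rank assumption) for its more communication-efficient scheme.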

【 License 】

Free   

【 Preview 】
Attachment list
Files | Size | Format | View
10_1016_j_neucom_2014_11_076.pdf | 847KB | PDF | download
Document metrics
Downloads: 5; Views: 0