Journal Article Details
NEUROCOMPUTING, Volume 363
Improved fixed-rank Nyström approximation via QR decomposition: Practical and theoretical aspects
Article
Pourkamali-Anaraki, Farhad [1]; Becker, Stephen [2]
[1] Univ Massachusetts, Dept Comp Sci, Lowell, MA 01854 USA
[2] Univ Colorado, Dept Appl Math, Boulder, CO 80309 USA
Keywords: Kernel methods; Nyström approximation; Matrix factorization
DOI: 10.1016/j.neucom.2019.06.070
Source: Elsevier
【 Abstract 】

The Nyström method is a popular technique that uses a small number of landmark points to compute a fixed-rank approximation of large kernel matrices that arise in machine learning problems. In practice, to ensure high quality approximations, the number of landmark points is chosen to be greater than the target rank. However, for simplicity the standard Nyström method uses a sub-optimal procedure for rank reduction. In this paper, we examine the drawbacks of the standard Nyström method in terms of poor performance and lack of theoretical guarantees. To address these issues, we present an efficient modification for generating improved fixed-rank Nyström approximations. Theoretical analysis and numerical experiments are provided to demonstrate the advantages of the modified method over the standard Nyström method. Overall, the aim of this paper is to convince researchers to use the modified method, as it has nearly identical computational complexity, is easy to code, has greatly improved accuracy in many cases, and is optimal in a sense that we make precise. © 2019 Elsevier B.V. All rights reserved.
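The abstract does not spell out the two constructions, but the contrast it draws can be sketched in a few lines of NumPy: the standard route truncates the small landmark block W before forming C W^+ C^T, while the QR-based route named in the title truncates after an orthogonal factorization of C. The sketch below is illustrative only, assuming an RBF kernel and randomly chosen landmarks; the function names nystrom_standard and nystrom_qr are invented for this example and are not the authors' code.

```python
import numpy as np

def nystrom_standard(K, idx, r):
    """Standard fixed-rank Nystrom: truncate the m x m landmark block W
    to its best rank-r approximation, then form C @ pinv(W_r) @ C.T."""
    C = K[:, idx]                          # n x m landmark columns
    W = K[np.ix_(idx, idx)]                # m x m intersection block
    w, V = np.linalg.eigh(W)               # eigenvalues in ascending order
    w, V = w[-r:], V[:, -r:]               # r leading eigenpairs (assumed positive)
    B = C @ V                              # n x r
    return (B / w) @ B.T                   # B diag(1/w) B^T = C W_r^+ C^T

def nystrom_qr(K, idx, r):
    """QR-based variant: best rank-r approximation of the full Nystrom
    estimate C @ pinv(W) @ C.T, obtained via a thin QR of C."""
    C = K[:, idx]
    W = K[np.ix_(idx, idx)]
    Q, R = np.linalg.qr(C)                 # C = Q R, Q has orthonormal columns
    M = R @ np.linalg.pinv(W) @ R.T        # small m x m core, symmetric PSD
    M = 0.5 * (M + M.T)                    # symmetrize against round-off
    w, V = np.linalg.eigh(M)
    w, V = w[-r:], V[:, -r:]               # best rank-r piece of the core
    U = Q @ V                              # n x r, orthonormal columns
    return (U * w) @ U.T                   # U diag(w) U^T

# Toy comparison on an RBF kernel (data and parameters are illustrative).
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq_dists / 10.0)
idx = rng.choice(500, size=50, replace=False)    # m = 50 landmark points
for r in (5, 10, 20):                            # target ranks r < m
    err_std = np.linalg.norm(K - nystrom_standard(K, idx, r), "fro")
    err_qr = np.linalg.norm(K - nystrom_qr(K, idx, r), "fro")
    print(f"r={r}:  standard {err_std:.3f}   QR-based {err_qr:.3f}")
```

Because Q has orthonormal columns, truncating the small core R W^+ R^T yields the best rank-r approximation of the full Nyström estimate C W^+ C^T, which appears to be the optimality the abstract alludes to; the only extra cost over the standard route is a thin QR of the n x m matrix C.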

【 License 】

Free   

【 Preview 】
Attachment List
File: 10_1016_j_neucom_2019_06_070.pdf (1391 KB, PDF)