Journal Article Details
NEUROCOMPUTING, Volume 69
Local regularization assisted orthogonal least squares regression
Article
Chen, S
Keywords: orthogonal least squares algorithm; regularization; regression; Bayesian learning; relevance vector machines; evidence procedure
DOI: 10.1016/j.neucom.2004.12.011
Source: Elsevier
【 Abstract 】

A locally regularized orthogonal least squares (LROLS) algorithm is proposed for constructing parsimonious or sparse regression models that generalize well. By associating each orthogonal weight in the regression model with an individual regularization parameter, the ability of orthogonal least squares model selection to produce a very sparse model with good generalization performance is greatly enhanced. Furthermore, with the assistance of local regularization, the criterion for terminating the model selection procedure becomes much clearer. A comparison with a state-of-the-art method for constructing sparse regression models, the relevance vector machine, is given. The proposed LROLS algorithm is shown to possess considerable computational advantages, including a well-conditioned solution and faster convergence. (c) 2005 Elsevier B.V. All rights reserved.
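For orientation, the sketch below illustrates the general idea described in the abstract: forward orthogonal least squares selection in which each candidate weight carries its own regularization parameter that is re-estimated in an outer loop. The function name lrols_sketch, the regularized error-reduction score, and the evidence-style lambda update are illustrative assumptions based on the abstract, not the paper's exact formulation.

```python
import numpy as np

def lrols_sketch(P, y, n_terms, n_outer=5):
    """Forward OLS selection with one regularization parameter per weight.

    P is an (N, M) matrix of candidate regressors, y an (N,) target vector.
    The selection score and the lambda re-estimation are evidence-style
    approximations of the scheme outlined in the abstract, not the exact
    formulas from the paper.
    """
    N, M = P.shape
    lam = np.full(M, 1e-6)                  # individual regularization parameters
    selected, g = [], []
    for _ in range(n_outer):                # outer loop: re-estimate the lambdas
        W = P.astype(float).copy()          # working copy, orthogonalized in place
        selected, g = [], []
        y_energy = float(y @ y)
        for _ in range(n_terms):
            best_i, best_score = -1, -np.inf
            for i in range(M):
                if i in selected:
                    continue
                wi = W[:, i]
                # regularized error-reduction ratio of candidate i
                score = (wi @ y) ** 2 / ((wi @ wi + lam[i]) * y_energy)
                if score > best_score:
                    best_i, best_score = i, score
            wi = W[:, best_i]
            g.append((wi @ y) / (wi @ wi + lam[best_i]))
            selected.append(best_i)
            for j in range(M):              # Gram-Schmidt: orthogonalize the rest
                if j not in selected:
                    W[:, j] -= (wi @ W[:, j]) / (wi @ wi) * wi
        # evidence-style re-estimation of the selected lambdas (assumed form)
        resid = y - sum(gk * W[:, ik] for gk, ik in zip(g, selected))
        gammas = [W[:, i] @ W[:, i] / (W[:, i] @ W[:, i] + lam[i]) for i in selected]
        denom = max(N - sum(gammas), 1e-12)
        for gi, gam, i in zip(g, gammas, selected):
            lam[i] = gam * (resid @ resid) / (denom * gi ** 2 + 1e-12)
    return selected, g, lam

# Toy usage: 3 of 20 candidate regressors actually generate the data.
rng = np.random.default_rng(0)
P = rng.standard_normal((50, 20))
y = 2.0 * P[:, 3] - 1.5 * P[:, 7] + 0.5 * P[:, 11] + 0.1 * rng.standard_normal(50)
print(lrols_sketch(P, y, n_terms=3))
```

In this sketch, large re-estimated lambdas effectively suppress irrelevant candidates, which is how the individual regularizers promote sparsity in the selected model.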

【 License 】

Free   

【 Preview 】
Attachment List
File                                Size    Format
10_1016_j_neucom_2004_12_011.pdf    361KB   PDF