Journal Article Details
NEUROCOMPUTING, Volume 390
Competitive regularised regression
Article
Jamil, Waqas [1]; Bouchachia, Abdelhamid [1]
[1] Bournemouth Univ, Dept Comp & Informat, Poole, Dorset, England
Keywords: Regression; Regularisation; Online learning; Competitive analysis
DOI: 10.1016/j.neucom.2019.08.094
Source: Elsevier
【 Abstract 】

Regularised regression uses sparsity and variance penalties to reduce the complexity and over-fitting of a regression model. The present paper introduces two novel regularised linear regression algorithms, Competitive Iterative Ridge Regression (CIRR) and Online Shrinkage via Limit of Gibbs Sampler (OSLOG), for fast and reliable prediction on Big Data without making distributional assumptions about the data. We design them using the technique of competitive analysis and establish strong theoretical guarantees. Furthermore, we compare their performance against recent regularised regression methods such as Online Ridge Regression (ORR) and the Aggregating Algorithm for Regression (AAR). The comparison is carried out both theoretically, focusing on guarantees on cumulative loss, and empirically, to show the advantages of CIRR and OSLOG. (C) 2019 Published by Elsevier B.V.
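The abstract compares the proposed algorithms against Online Ridge Regression (ORR), one of the baselines it names. As a point of reference, the following is a minimal sketch of standard online ridge regression (not the paper's CIRR or OSLOG, whose update rules are not given here): at each round the learner predicts with the ridge solution over all examples seen so far, then incorporates the new example. The class name and toy data are illustrative assumptions.

```python
import numpy as np

class OnlineRidge:
    """Minimal online ridge regression sketch (an assumed illustration,
    not the paper's algorithm). At round t the prediction uses
    w_t = (a*I + sum_{s<t} x_s x_s^T)^{-1} sum_{s<t} y_s x_s."""

    def __init__(self, dim, a=1.0):
        self.A = a * np.eye(dim)  # regularised Gram matrix: a*I + sum of x x^T
        self.b = np.zeros(dim)    # accumulated sum of y * x

    def predict(self, x):
        # Solve the ridge system instead of forming an explicit inverse.
        w = np.linalg.solve(self.A, self.b)
        return float(w @ x)

    def update(self, x, y):
        # Rank-one update of the Gram matrix and the target accumulator.
        self.A += np.outer(x, x)
        self.b += y * x

# Toy run: learn y = 2*x1 - x2 online and track cumulative squared loss,
# the quantity the paper's competitive guarantees are stated in terms of.
rng = np.random.default_rng(0)
model = OnlineRidge(dim=2, a=1.0)
cumulative_loss = 0.0
for _ in range(200):
    x = rng.normal(size=2)
    y = 2.0 * x[0] - x[1]
    cumulative_loss += (model.predict(x) - y) ** 2
    model.update(x, y)
```

Because the data are noiseless, the ridge weights shrink toward the true coefficients (2, -1) as the Gram matrix dominates the regulariser, and the per-round loss vanishes.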

【 License 】

Free   

【 Preview 】
Attachment list
Files Size Format
10_1016_j_neucom_2019_08_094.pdf 578KB PDF