Journal article details
JOURNAL OF APPROXIMATION THEORY, Volume 164
Support vector machines regression with l1-regularizer
Article
Tong, Hongzhi1  Chen, Di-Rong2,3  Yang, Fenghong4 
[1] Univ Int Business & Econ, Sch Informat Technol & Management, Beijing 100029, Peoples R China
[2] Beijing Univ Aeronaut & Astronaut, Dept Math, Beijing 100083, Peoples R China
[3] Beijing Univ Aeronaut & Astronaut, LMIB, Beijing 100083, Peoples R China
[4] Cent Univ Finance & Econ, Sch Appl Math, Beijing 100081, Peoples R China
Keywords: Support vector machines regression; Coefficient regularization; Learning rate; Reproducing kernel Hilbert spaces; Error decomposition
DOI: 10.1016/j.jat.2012.06.005
Source: Elsevier
【 Abstract 】

The classical support vector machines regression (SVMR) is known as a regularized learning algorithm in reproducing kernel Hilbert spaces (RKHS) with an epsilon-insensitive loss function and an RKHS-norm regularizer. In this paper, we study a new SVMR algorithm in which the regularization term is proportional to the l1-norm of the coefficients in the kernel ensembles. We provide an error analysis of this algorithm, and an explicit learning rate is then derived under some assumptions. (c) 2012 Elsevier Inc. All rights reserved.
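To fix notation, a minimal sketch of the two schemes in standard form (the sample size m, the regularization parameter lambda, and the normalization of the empirical risk follow the usual conventions and are assumptions here, not necessarily the choices made in the paper): given a Mercer kernel K and samples (x_i, y_i), i = 1, ..., m, the classical SVMR solves

    f_z = \arg\min_{f \in \mathcal{H}_K} \left\{ \frac{1}{m} \sum_{i=1}^{m} |y_i - f(x_i)|_{\varepsilon} + \lambda \|f\|_K^2 \right\}, \qquad |t|_{\varepsilon} = \max\{|t| - \varepsilon, 0\},

whereas the l1-regularized variant studied here searches over kernel ensembles on the sample, f_\alpha = \sum_{i=1}^{m} \alpha_i K(x_i, \cdot), and penalizes the l1-norm of the coefficient vector:

    \alpha_z = \arg\min_{\alpha \in \mathbb{R}^m} \left\{ \frac{1}{m} \sum_{i=1}^{m} |y_i - f_\alpha(x_i)|_{\varepsilon} + \lambda \sum_{i=1}^{m} |\alpha_i| \right\}, \qquad f_z = f_{\alpha_z}.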

【 License 】

Free   

【 Preview 】
Attachment list
Files  Size  Format
10_1016_j_jat_2012_06_005.pdf  227KB  PDF