| NEUROCOMPUTING | Volume: 249 |
| Error analysis of regularized least-square regression with Fredholm kernel | |
| Article | |
| Tao, Yanfang [1,4]; Yuan, Peipei [2]; Song, Biqin [3] | |
| [1] Hubei Univ, Fac Math & Stat, Wuhan 430062, Peoples R China | |
| [2] Huazhong Agr Univ, Coll Engn, Wuhan 430070, Peoples R China | |
| [3] Huazhong Agr Univ, Coll Sci, Wuhan 430070, Peoples R China | |
| [4] Changjiang Polytech, Dept Basic Courses, Wuhan 430074, Peoples R China | |
| Keywords: Fredholm learning; Generalization bound; Learning rate; Data-dependent hypothesis spaces | |
| DOI : 10.1016/j.neucom.2017.03.076 | |
| Source: Elsevier | |
【 Abstract 】
Learning with the Fredholm kernel has attracted increasing attention recently, since it can effectively exploit the data information to improve prediction performance. Despite rapid progress on theoretical and experimental evaluations, its generalization analysis has not been explored in the learning theory literature. In this paper, we establish the generalization bound of least-square regularized regression with the Fredholm kernel, which implies that the fast learning rate O(l^{-1}) can be reached under mild conditions (l is the number of labeled samples). Simulated examples show that this Fredholm regression algorithm achieves satisfactory prediction performance. (C) 2017 Elsevier B.V. All rights reserved.
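The algorithm the abstract refers to, regularized least-square (kernel ridge) regression with a data-dependent Fredholm kernel, can be sketched as follows. This is a minimal illustration and not the paper's own implementation: it assumes the common Fredholm-kernel construction K_F(x, z) = (1/u^2) Σ_{s,t} k(x, v_s) k(v_s, v_t) k(v_t, z) over unlabeled points v_s, with Gaussian base kernels; all function names and parameter values here are hypothetical choices for the sketch.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fredholm_kernel(X1, X2, U, gamma=1.0):
    # Data-dependent Fredholm kernel built from unlabeled points U:
    #   K_F(x, z) = (1/u^2) * sum_{s,t} k(x, v_s) k(v_s, v_t) k(v_t, z)
    K1 = gaussian_kernel(X1, U, gamma)
    KU = gaussian_kernel(U, U, gamma)
    K2 = gaussian_kernel(U, X2, gamma)
    return K1 @ KU @ K2 / U.shape[0] ** 2

def fredholm_krr(X, y, U, lam=1e-6, gamma=1.0):
    # Regularized least-square regression with the Fredholm kernel:
    # solve (K_F + lam * l * I) alpha = y on the l labeled samples,
    # then predict with f(x) = sum_i alpha_i K_F(x, x_i).
    K = fredholm_kernel(X, X, U, gamma)
    alpha = np.linalg.solve(K + lam * len(y) * np.eye(len(y)), y)
    return lambda Xt: fredholm_kernel(Xt, X, U, gamma) @ alpha
```

Note that the kernel itself is averaged over the unlabeled sample, which is how this family of methods injects unlabeled data into an otherwise standard regularized least-squares fit.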
【 License 】
Free
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| 10_1016_j_neucom_2017_03_076.pdf | 772 KB | PDF | |