Conference paper details
13th International Conference on Artificial Intelligence and Statistics
Kernel Partial Least Squares is Universally Consistent
Computer Science; Mathematical Sciences
Gilles Blanchard, Nicole Krämer
PID: 119805
Source: CEUR
【 Abstract 】

We prove the statistical consistency of kernel Partial Least Squares Regression applied to a bounded regression learning problem on a reproducing kernel Hilbert space. Partial Least Squares stands out among well-known classical approaches such as Ridge Regression or Principal Components Regression, as it is neither defined as the solution of a global cost minimization procedure over a fixed model, nor is it a linear estimator. Instead, approximate solutions are constructed by projections onto a nested set of data-dependent subspaces. To prove consistency, we exploit the known fact that Partial Least Squares is equivalent to the conjugate gradient algorithm in combination with early stopping. The choice of the stopping rule (number of iterations) is a crucial point. We study two empirical stopping rules. The first one monitors the estimation error in each iteration step of Partial Least Squares, and the second one estimates the empirical complexity in terms of a condition number. Both stopping rules lead to universally consistent estimators provided the kernel
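The conjugate-gradient view described in the abstract can be illustrated with a short sketch: plain conjugate gradient applied to the kernel system K·alpha = y, stopped after a small fixed number of iterations. This is a minimal illustration, not the paper's construction; the function names, the RBF kernel choice, and the simple residual-based stopping check are assumptions for the example, and the exact correspondence between kernel PLS and CG in the paper involves additional details (e.g. the precise inner product and normal-equations formulation).

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X and Z (illustrative choice).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_cg_regression(K, y, n_iter=5, tol=1e-10):
    """Conjugate gradient on K @ alpha = y, stopped after n_iter steps.

    Early stopping (small n_iter) acts as the regularizer, in the spirit of
    the PLS/CG equivalence discussed in the abstract (simplified sketch).
    """
    alpha = np.zeros(len(y))
    r = y - K @ alpha          # residual
    p = r.copy()               # search direction
    for _ in range(n_iter):
        rr = r @ r
        if rr < tol:           # crude residual-based stopping check
            break
        Kp = K @ p
        step = rr / (p @ Kp)
        alpha += step * p
        r -= step * Kp
        p = r + (r @ r / rr) * p
    return alpha

# Toy usage: fit a smooth 1-D target from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(40)
K = rbf_kernel(X, X, gamma=5.0)
alpha = kernel_cg_regression(K, y, n_iter=8)
y_hat = K @ alpha
print("training MSE:", float(np.mean((y_hat - y) ** 2)))
```

Increasing `n_iter` drives the training residual down but risks overfitting; the two data-driven stopping rules studied in the paper are precisely about choosing this iteration count from the sample.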

【 Preview 】
Attachment list
Files  Size  Format
Kernel Partial Least Squares is Universally Consistent  887KB  PDF
Document metrics
Downloads: 49  Views: 51