Kernel Partial Least Squares is Universally Consistent

We prove the statistical consistency of kernel Partial Least Squares Regression applied to a bounded regression learning problem on a reproducing kernel Hilbert space. Partial Least Squares stands out among well-known classical approaches such as Ridge Regression or Principal Components Regression in that it is neither defined as the solution of a global cost minimization procedure over a fixed model, nor is it a linear estimator. Instead, approximate solutions are constructed by projections onto a nested set of data-dependent subspaces. To prove consistency, we exploit the known fact that Partial Least Squares is equivalent to the conjugate gradient algorithm in combination with early stopping. The choice of the stopping rule (number of iterations) is a crucial point. We study two empirical stopping rules. The first one monitors the estimation error in each iteration step of Partial Least Squares, and the second one estimates the empirical complexity in terms of a condition number. Both stopping rules lead to universally consistent estimators provided the kernel is universal.
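The equivalence with conjugate gradients makes the estimator easy to illustrate. The sketch below is not the paper's exact algorithm or either of its stopping rules: it runs standard conjugate gradient iterations on the kernel system K alpha = y and stops early after a fixed number of steps or once a simple residual criterion is met, with the m-th iterate playing the role of a kernel PLS estimator with m components. The RBF kernel, the tolerance `tol`, and the residual-based criterion are illustrative assumptions introduced here.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix (an example of a universal kernel)."""
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def kernel_pls_via_cg(K, y, max_components=10, tol=1e-3):
    """Conjugate gradient iterations on K @ alpha = y.

    The iterate after m steps stands in for a kernel PLS estimator with
    m components; stopping is governed by a crude residual check, a
    stand-in for the empirical stopping rules analyzed in the paper."""
    n = y.shape[0]
    alpha = np.zeros(n)
    r = y - K @ alpha            # current residual
    p = r.copy()                 # current search direction
    for m in range(max_components):
        Kp = K @ p
        step = (r @ r) / (p @ Kp)
        alpha = alpha + step * p
        r_new = r - step * Kp
        if np.linalg.norm(r_new) / np.sqrt(n) < tol:   # early stopping
            return alpha, m + 1
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return alpha, max_components

# Illustrative usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)
K = rbf_kernel(X, X, gamma=2.0)
alpha, n_components = kernel_pls_via_cg(K, y, max_components=15)
y_hat = K @ alpha   # fitted values f(x_i) = sum_j alpha_j k(x_i, x_j)
print(n_components, np.mean((y_hat - y) ** 2))
```

Stopping after too few iterations underfits, while running to convergence reproduces the (possibly overfitting) kernel least-squares interpolant; the number of components thus acts as the regularization parameter, which is why the choice of stopping rule is the crucial point of the analysis.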