Journal Article Details
NEUROCOMPUTING, Volume 64
Building sparse representations and structure determination on LS-SVM substrates
Article; Proceedings Paper
Pelckmans, K ; Suykens, JAK ; De Moor, B
Keywords: least-squares support vector machines; regularization; structure detection; model selection; convex optimization; sparseness
DOI: 10.1016/j.neucom.2004.11.029
Source: Elsevier
【 Abstract 】

This paper proposes a new method to obtain sparseness and structure detection for a class of kernel machines related to least-squares support vector machines (LS-SVMs). The key method is to adopt a hierarchical modeling strategy. Here, the first level consists of an LS-SVM substrate, which is based upon an LS-SVM formulation with an additive regularization trade-off. This regularization trade-off is determined at higher levels such that sparse representations and/or structure detection are obtained. Using the necessary and sufficient conditions for optimality given by the Karush-Kuhn-Tucker conditions, one can guide the interaction between the different levels via a well-defined set of hyper-parameters. From a computational point of view, all levels can be fused into a single convex optimization problem. Furthermore, the principle is applied in order to optimize the validation performance of the resulting kernel machine. Sparse representations as well as structure detection are obtained, respectively, by using an L1 regularization scheme and a measure of maximal variation at the second level. A number of case studies indicate the usefulness of these approaches with respect to both the interpretability of the final model and generalization performance. (c) 2005 Elsevier B.V. All rights reserved.
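As a rough illustration of the fused convex problem sketched in the abstract, the Python snippet below (using the cvxpy package) fits a kernel expansion f(x) = sum_i alpha_i K(x, x_i) + b on toy data, adds an L1 penalty on the support values alpha to encourage sparseness, and folds a validation-error term into the same convex objective. The toy sinc data, the RBF kernel width, and the weight `lam` are arbitrary choices made for illustration only; the paper's actual LS-SVM substrate with additive regularization trade-off and the KKT-based coupling between levels is not reproduced here.

```python
# Minimal sketch: a sparse kernel model posed as a single convex problem.
# This is an illustrative simplification, not the paper's exact formulation.
import numpy as np
import cvxpy as cp

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian RBF kernel matrix between row sets X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Toy sinc data with a simple train/validation split (hypothetical example)
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(60, 1)), axis=0)
y = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(60)
Xtr, ytr, Xval, yval = X[::2], y[::2], X[1::2], y[1::2]

Ktr = rbf_kernel(Xtr, Xtr)   # kernel matrix on training points
Kval = rbf_kernel(Xval, Xtr) # kernel evaluations of validation vs. training

alpha = cp.Variable(len(ytr))  # support values (dual variables)
b = cp.Variable()              # bias term
lam = 0.5                      # L1 weight, chosen ad hoc for illustration

train_res = Ktr @ alpha + b - ytr   # training residuals
val_res = Kval @ alpha + b - yval   # validation residuals

# One convex problem: fit the training data, keep validation error small,
# and drive many alpha_i exactly to zero via the L1 term (sparseness).
objective = cp.Minimize(cp.sum_squares(train_res)
                        + cp.sum_squares(val_res)
                        + lam * cp.norm1(alpha))
cp.Problem(objective).solve()

print("nonzero support values:",
      int(np.sum(np.abs(alpha.value) > 1e-4)), "of", len(ytr))
```

Solving the whole problem at once mirrors the "fusion" idea: the fitting level and the (validation/sparseness) level are handled by a single convex solver rather than by nested grid searches.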

【 License 】

Free   

【 Preview 】
Attachment List
File | Size | Format
10_1016_j_neucom_2004_11_029.pdf | 749KB | PDF