Journal Article Details
NEUROCOMPUTING, Vol. 211
Building support vector machines in the context of regularized least squares
Article
Peng, Jian-Xun1  Rafferty, Karen1  Ferguson, Stuart1 
[1] Queens Univ Belfast, Sch Elect Elect Engn & Comp Sci, Ashby Bldg, Stranmillis Rd, Belfast BT9 5AH, Antrim, North Ireland
Keywords: Data classification;  Support vector machines;  Regularized least squares;  Fast training algorithm;  Cholesky decomposition
DOI  :  10.1016/j.neucom.2016.03.087
Source: Elsevier
【 Abstract 】

This paper formulates a linear kernel support vector machine (SVM) as a regularized least-squares (RLS) problem. By defining a set of indicator variables for the errors, the solution to the RLS problem is represented as an equation that relates the error vector to the indicator variables. Through partitioning the training set, the SVM weights and bias are expressed analytically in terms of the support vectors. It is also shown how this approach naturally extends to SVMs with nonlinear kernels, while avoiding the need for Lagrange multipliers and duality theory. A fast iterative algorithm based on Cholesky decomposition with permutation of the support vectors is proposed as a solution method. The properties of our SVM formulation are analyzed and compared with standard SVMs using a simple example that can be illustrated graphically. The correctness and behavior of our solution (derived purely in the primal context of RLS) are demonstrated on a set of public benchmarking problems for both linear and nonlinear SVMs. (C) 2016 Elsevier B.V. All rights reserved.
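
The iterative scheme described in the abstract can be illustrated with a small sketch. The Python code below is not the authors' implementation: the function name rls_svm_fit is hypothetical, the bias is regularized together with the weights for simplicity (the paper's treatment of the bias may differ), and the Cholesky factorization is recomputed from scratch at each iteration rather than updated by permutation of the support vectors as the paper proposes. It shows the core idea: repeatedly solve a regularized least-squares problem restricted to the current set of margin violators until that set stops changing.

    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    def rls_svm_fit(X, y, C=1.0, max_iter=100):
        """Primal L2-loss linear SVM via iterative regularized least squares.

        X: (n, d) feature matrix; y: (n,) labels in {-1, +1}.
        Returns (weights, bias).
        """
        n, d = X.shape
        Xb = np.hstack([X, np.ones((n, 1))])   # append constant column for the bias
        sv = np.ones(n, dtype=bool)            # initial guess: every sample violates the margin
        w = np.zeros(d + 1)
        for _ in range(max_iter):
            Xs, ys = Xb[sv], y[sv]
            # RLS on the current error set:
            #   min ||w||^2 / (2C) + sum_{i in sv} (y_i - w.x_i)^2
            # (note (1 - y_i w.x_i)^2 = (y_i - w.x_i)^2 since y_i^2 = 1)
            A = np.eye(d + 1) / (2.0 * C) + Xs.T @ Xs
            w = cho_solve(cho_factor(A), Xs.T @ ys)   # A is SPD, so Cholesky applies
            new_sv = y * (Xb @ w) < 1.0               # margin violators = support vectors
            if np.array_equal(new_sv, sv):            # support set stable: fixed point reached
                break
            sv = new_sv
        return w[:-1], w[-1]

    # Usage on synthetic data (two Gaussian blobs):
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(+1, 1, (50, 2))])
    y = np.hstack([-np.ones(50), np.ones(50)])
    w, b = rls_svm_fit(X, y, C=10.0)

Because only the rows indexed by the support set enter the normal equations, each solve touches a (d+1)-by-(d+1) system regardless of how many training points there are; the efficiency gain the paper targets comes from updating the Cholesky factor incrementally as points enter or leave that set, which this plain sketch omits.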

【 License 】

Free   

【 Preview 】
Attachments
File                                  Size    Format
10_1016_j_neucom_2016_03_087.pdf      849KB   PDF
  Document Metrics
  Downloads: 6    Views: 0