Thesis Details
Optimization Methods for SCAD-penalized Support Vector Machine
College of Natural Sciences, Department of Statistics
University: Seoul National University Graduate School
Keywords: Local approximation algorithm; Smoothly clipped absolute deviation penalty; Support vector machine; Variable selection; Initialization; 519.5
Others: http://s-space.snu.ac.kr/bitstream/10371/142472/1/000000150000.pdf
United States | English
Source: Seoul National University Open Repository
【 Abstract 】

The support vector machine (SVM) is a powerful tool for binary classification problems, but its performance degrades when redundant variables are included. Several variants of the SVM have been proposed to address this problem. Among them, the smoothly clipped absolute deviation penalized SVM (SCAD SVM) has been shown to perform effective variable selection. However, its optimization is complicated by the nonconvexity of the penalty and the presence of multiple local minima. This paper summarizes the local quadratic approximation (LQA) and local linear approximation (LLA) methods, the primary optimization methods for the SCAD SVM, and contributes two new approaches. First, the envelope method is applied in the derivation of each algorithm in place of the usual Taylor series expansion, giving a more general derivation than the conventional one. Second, beyond the previously known limitations of the LQA method and the comparative advantages of the LLA method, we establish the insensitivity of the LLA method to the initial value and present theory on the convergence of the LLA algorithm to the oracle estimator for an arbitrary initial value. Finally, a simulation study verifies that the LLA method gives better results than the LQA method for any initial value.
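For reference, the approximations summarized above can be written with the standard SCAD penalty from the literature; the iterate notation $\beta^{(k)}$ and the tuning constant $a$ (commonly $a = 3.7$) are introduced here for illustration and are not taken from the thesis itself. A minimal sketch, assuming the hinge-loss formulation of the SCAD SVM:

\[
p_\lambda'(t) = \lambda \left\{ I(t \le \lambda) + \frac{(a\lambda - t)_+}{(a-1)\lambda}\, I(t > \lambda) \right\}, \qquad t \ge 0,\ a > 2.
\]

\[
\text{LLA: } p_\lambda(|\beta_j|) \approx p_\lambda\!\left(|\beta_j^{(k)}|\right) + p_\lambda'\!\left(|\beta_j^{(k)}|\right)\left(|\beta_j| - |\beta_j^{(k)}|\right),
\]

\[
\beta^{(k+1)} = \arg\min_{\beta_0,\,\beta} \sum_{i=1}^n \left[1 - y_i\left(\beta_0 + x_i^\top \beta\right)\right]_+ + \sum_{j=1}^p p_\lambda'\!\left(|\beta_j^{(k)}|\right) |\beta_j|.
\]

\[
\text{LQA: } p_\lambda(|\beta_j|) \approx p_\lambda\!\left(|\beta_j^{(k)}|\right) + \frac{p_\lambda'\!\left(|\beta_j^{(k)}|\right)}{2\,|\beta_j^{(k)}|} \left(\beta_j^2 - \left(\beta_j^{(k)}\right)^2\right).
\]

Under this form, each LLA step reduces to a weighted L1-norm SVM, a convex problem solvable by standard methods (e.g., linear programming), whereas the LQA step requires care when some $|\beta_j^{(k)}|$ is near zero.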

【 Preview 】
Attachments
File | Size | Format | View
Optimization Methods for SCAD-penalized Support Vector Machine | 2401 KB | PDF | download
Document metrics
Downloads: 12; Views: 15