The support vector machine (SVM) is a powerful tool for binary classification problems, but its performance deteriorates when redundant variables are involved. Several variants of the SVM have been proposed to rectify this problem. Among them, the smoothly clipped absolute deviation penalized SVM (SCAD SVM) has been shown to perform effective variable selection. However, the SCAD penalty is nonconvex, so the resulting optimization problem can have multiple local minima. This paper reviews the local quadratic approximation (LQA) and local linear approximation (LLA) methods, the primary optimization methods for the SCAD SVM, and contributes two new results. First, the envelope method is applied in the derivation of each algorithm in place of the usual Taylor series expansion, providing a more general derivation than the conventional one. Second, beyond the previously known limitations of the LQA method and the comparative advantages of the LLA method, we show that the LLA method is insensitive to the initial value and establish convergence of the LLA algorithm to the oracle estimator for an arbitrary initial value. Finally, a simulation study confirms that the LLA method gives better results than the LQA method for any initial value.
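To make the LLA idea concrete, the sketch below (an illustrative assumption, not the paper's implementation) iterates the standard LLA scheme for a linear SCAD SVM: each outer step majorizes the SCAD penalty by a weighted L1 penalty with weights given by the SCAD derivative at the current iterate, and the inner weighted L1 hinge-loss problem is solved here by plain subgradient descent for simplicity. The functions `scad_deriv`, `weighted_l1_svm`, and `lla_scad_svm` are hypothetical names introduced for this sketch.

```python
import numpy as np

def scad_deriv(t, lam, a=3.7):
    # Derivative p'_lambda(t) of the SCAD penalty for t >= 0:
    # lam on [0, lam], linearly decreasing to 0 on (lam, a*lam], then 0.
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1.0))

def weighted_l1_svm(X, y, w, n_iter=2000, lr=0.01):
    # Toy inner solver (assumption, not the paper's solver):
    # minimize mean(hinge(1 - y * X @ beta)) + sum(w * |beta|)
    # by subgradient descent.
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        margin = y * (X @ beta)
        active = margin < 1  # points violating the margin
        g_hinge = -(X * y[:, None]).T @ active / n
        beta -= lr * (g_hinge + w * np.sign(beta))
    return beta

def lla_scad_svm(X, y, lam, a=3.7, n_outer=5, beta0=None):
    # LLA outer loop: the first step (from beta = 0) reduces to an
    # ordinary L1-penalized SVM, since p'_lambda(0) = lam for all j.
    p = X.shape[1]
    beta = np.zeros(p) if beta0 is None else np.asarray(beta0, float).copy()
    for _ in range(n_outer):
        w = scad_deriv(np.abs(beta), lam, a)
        beta = weighted_l1_svm(X, y, w)
    return beta
```

On toy data where only the first variable drives the label, the LLA iterates concentrate weight on that variable while the redundant coefficients stay near zero, illustrating the variable-selection behavior described above; the weights on large coefficients shrink to zero across outer iterations, which is how LLA avoids the over-shrinkage of a plain L1 penalty.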
Optimization Methods for SCAD-penalized Support Vector Machine