Optimization Methods for SCAD-penalized Support Vector Machine
SCAD-벌점화 지지벡터기계 모형에 대한 최적화 방법들
- College of Natural Sciences, Dept. of Statistics
- Issue Date: February 2018
- Seoul National University Graduate School
- Keywords: Local approximation algorithm; Smoothly clipped absolute deviation penalty; Support vector machine; Variable selection; Initialization
- Thesis (Master's) -- Seoul National University Graduate School: College of Natural Sciences, Dept. of Statistics, February 2018. Advisor: 원중호 (Joong-Ho Won).
- The support vector machine (SVM) is a powerful tool for binary classification problems, but its performance degrades when redundant variables are involved. Several variants of the SVM have been proposed to rectify this problem. Among them, the smoothly clipped absolute deviation penalized SVM (SCAD SVM) has been shown to perform effective variable selection. However, the SCAD penalty is nonconvex, so the resulting optimization problem can have multiple local minima. This paper reviews the local quadratic approximation (LQA) and the local linear approximation (LLA) methods, the primary optimization methods for the SCAD SVM, and makes two new contributions. First, each algorithm is derived via the envelope method rather than the usual Taylor series expansion; the envelope method generalizes the conventional derivation. Second, beyond the previously known limitations of the LQA method and the comparative advantages of the LLA method, we show that the LLA method is insensitive to the choice of initial value and present theory on the convergence of the LLA algorithm to the oracle estimator from an arbitrary initial value. Finally, a simulation study verifies that the LLA method gives better results than the LQA method for any initial value.