
Optimization Methods for the SCAD-penalized Support Vector Machine

Authors

이한별

Advisor
원중호
Major
Department of Statistics, College of Natural Sciences
Issue Date
2018-02
Publisher
Seoul National University Graduate School
Keywords
Local approximation algorithm; Smoothly clipped absolute deviation penalty; Support vector machine; Variable selection; Initialization
Description
Thesis (Master's) -- Seoul National University Graduate School: Department of Statistics, College of Natural Sciences, February 2018. Advisor: 원중호.
Abstract
The support vector machine (SVM) is a powerful tool for binary classification problems, but its performance degrades when redundant variables are involved. Several variants of the SVM have been proposed to rectify this problem. Among them, the smoothly clipped absolute deviation penalized SVM (SCAD SVM) has been shown to perform effective variable selection. However, its optimization is complicated by nonconvexity and multiple local minima. This paper summarizes the local quadratic approximation (LQA) and local linear approximation (LLA) methods, the primary optimization methods for the SCAD SVM, and introduces two new approaches. First, the envelope method is applied in the derivation of each algorithm in place of the usual Taylor series expansion, yielding a more general derivation than the conventional one. Second, beyond the previously known limitations of the LQA method and the comparative advantages of the LLA method, we establish the LLA method's insensitivity to the initial value and present theoretical results on the convergence of the LLA algorithm to the oracle estimator from an arbitrary initial value. Lastly, a simulation study verifies that the LLA method outperforms the LQA method regardless of the initial value.
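To make the LLA idea in the abstract concrete: at each iteration, the nonconvex SCAD penalty is majorized by a weighted L1 penalty whose weights are the SCAD derivative evaluated at the current iterate, so each step reduces to a convex (lasso-type) SVM problem. The sketch below is illustrative only and not taken from the thesis; the function names and the choice a = 3.7 (the value recommended by Fan and Li) are assumptions.

```python
def scad_deriv(t, lam, a=3.7):
    """Derivative p'_lam(t) of the SCAD penalty for t >= 0.

    p'_lam(t) = lam                      if t <= lam,
                (a*lam - t)/(a - 1)      if lam < t <= a*lam,
                0                        if t > a*lam.
    The knot a = 3.7 is the conventional choice; illustrative, not from the thesis.
    """
    if t <= lam:
        return lam
    return max(a * lam - t, 0.0) / (a - 1.0)


def lla_weights(beta, lam, a=3.7):
    """One LLA reweighting step (hypothetical helper): the weight on |beta_j|
    in the next convex subproblem is p'_lam(|beta_j|) at the current iterate.
    Small coefficients keep the full lasso weight lam; coefficients beyond
    a*lam receive weight 0, so large signals are left unpenalized."""
    return [scad_deriv(abs(b), lam, a) for b in beta]
```

For example, with lam = 1.0 a current iterate (0.0, 0.5, 5.0) yields weights (1.0, 1.0, 0.0): the zero and small coefficients are still shrunk like the lasso, while the large coefficient is no longer penalized, which is the mechanism behind the oracle property discussed in the abstract.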
Language
English
URI
https://hdl.handle.net/10371/142472