Large-scale Linear Support Vector Regression


Bibliographic Details
Main Authors: Chia-Hua Ho, 何家華
Other Authors: Chih-Jen Lin
Format: Others
Language: English (en_US)
Published: 2012
Online Access: http://ndltd.ncl.edu.tw/handle/90643485073745704762
Description
Summary: Master's thesis === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === Academic year 100 (ROC) === Support vector regression (SVR) and support vector classification (SVC) are popular learning techniques, but their use with kernels is often time consuming. Recently, linear SVC without kernels has been shown to give competitive accuracy for some applications while enjoying much faster training and testing. However, few studies have focused on linear SVR. In this thesis, we extend state-of-the-art training methods for linear SVC to linear SVR. We show that the extension is straightforward for some methods but nontrivial for others. Our experiments demonstrate that for some problems, the proposed linear-SVR training methods can very efficiently produce models that are as good as those from kernel SVR.
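The abstract's central claim, that linear SVR can train much faster than kernel SVR while giving comparable models, can be illustrated with off-the-shelf tools. The sketch below is not the thesis's own implementation; it uses scikit-learn's `LinearSVR` (which wraps the LIBLINEAR library maintained by the advisor's group) against an RBF-kernel `SVR` on a synthetic linear regression task, where the dataset, parameters, and timing setup are all illustrative assumptions.

```python
# Illustrative comparison of linear vs. kernel SVR (not the thesis code).
# Dataset and hyperparameters are arbitrary choices for demonstration.
import time

from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR, LinearSVR

# Synthetic data with a linear underlying relationship.
X, y = make_regression(n_samples=5000, n_features=100, noise=0.1,
                       random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Linear SVR: solves the primal/dual without kernels (LIBLINEAR backend).
linear = LinearSVR(C=1.0, epsilon=0.1, max_iter=10000, random_state=0)
t0 = time.time()
linear.fit(X_tr, y_tr)
t_linear = time.time() - t0

# Kernel SVR: RBF kernel, cost grows with the number of training samples.
kernel = SVR(kernel="rbf", C=1.0, epsilon=0.1)
t0 = time.time()
kernel.fit(X_tr, y_tr)
t_kernel = time.time() - t0

print(f"LinearSVR: {t_linear:.2f}s  R^2 = {linear.score(X_te, y_te):.3f}")
print(f"RBF SVR:   {t_kernel:.2f}s  R^2 = {kernel.score(X_te, y_te):.3f}")
```

On data that is close to linear, as here, the linear model typically matches the kernel model's test accuracy at a fraction of the training cost; on genuinely nonlinear data the kernel model would retain an accuracy advantage, which is why the abstract hedges with "for some problems".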