Low-degree Polynomial Mapping of Data for SVM

Bibliographic Details
Main Authors: Yin-Wen Chang, 張瀠文
Other Authors: Chih-Jen Lin
Format: Others
Language: en_US
Published: 2009
Online Access: http://ndltd.ncl.edu.tw/handle/85416613440796675619
Description
Summary: Master's thesis === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === 97 === Non-linear mapping functions have long been used in SVMs to transform data into a higher-dimensional space, allowing the classifier to separate non-linearly distributed data instances. Kernel tricks are used to avoid the problem of the huge number of features of the mapped data points. However, training/testing on large data is often time consuming. Following recent advances in training large linear SVMs (i.e., SVMs without nonlinear kernels), this work proposes a method that strikes a balance between training/testing speed and testing accuracy. We apply the fast training method for linear SVMs to the expanded form of the data under low-degree polynomial mappings. The method enjoys fast training/testing, yet may achieve testing accuracy close to that of using highly nonlinear kernels. Empirical experiments show that the proposed method is useful for certain large-scale data sets.
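The idea in the abstract can be illustrated with a minimal sketch: explicitly expand each instance under a degree-2 polynomial mapping, then train a linear SVM on the expanded features. This is not the thesis's own implementation (which builds on fast linear-SVM training such as LIBLINEAR); the scikit-learn classes and the synthetic data set below are illustrative assumptions only.

```python
# Sketch: explicit degree-2 polynomial mapping + linear SVM,
# assuming scikit-learn is available; data set is synthetic.
from sklearn.datasets import make_circles
from sklearn.preprocessing import PolynomialFeatures
from sklearn.svm import LinearSVC

# Two concentric circles: not linearly separable in the original space.
X, y = make_circles(n_samples=400, noise=0.05, factor=0.4, random_state=0)

# Map each instance to its explicit degree-2 polynomial expansion
# (x1, x2) -> (x1, x2, x1^2, x1*x2, x2^2).
poly = PolynomialFeatures(degree=2, include_bias=False)
X_mapped = poly.fit_transform(X)

# Train a linear SVM on the expanded data; because the expansion is
# explicit and low degree, training stays fast while the decision
# boundary in the original space is quadratic.
clf = LinearSVC(C=1.0, max_iter=10000).fit(X_mapped, y)
acc = clf.score(X_mapped, y)
```

A linear model on the raw two features would fail here, but after the degree-2 expansion the circular boundary becomes a hyperplane in the mapped space, mirroring the trade-off the thesis describes: near-kernel accuracy at linear-SVM speed.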