Robust Smooth Support Vector Machine Learning
PhD === National Taiwan University of Science and Technology === Department of Computer Science and Information Engineering === 98 === This dissertation proposes four robust smooth support vector machine (SSVM) learning methodologies. First, we propose a new approach to generating a representative reduced set for the reduced support vector machine (RSVM): the clustering reduced support vector machine (CRSVM) generates cluster centroids for each class and uses them to form the reduced set. By estimating the approximate density of each cluster, we can compute the width parameter of the Gaussian kernel. Second, we modify the earlier 2-norm soft-margin smooth support vector machine (SSVM2) to obtain a new 1-norm soft-margin smooth support vector machine (SSVM1). We also propose a heuristic outlier-filtering method for SSVMs that adds little cost to the training process and substantially improves resistance to outliers. Third, we introduce the smoothing technique into the 1-norm SVM, yielding smooth LASSO for classification (SLASSO), which performs classification and feature selection simultaneously. Results show that SLASSO is slightly more accurate than competing approaches while retaining the desirable ability to suppress irrelevant features. Finally, we implement a ternary SSVM (TSSVM) and use it to design a novel multiclass classification scheme, one-vs.-one-vs.-rest (OOR), which decomposes a k-class problem into a series of k(k-1)/2 ternary classification subproblems. Results show that TSSVM/OOR outperforms both one-vs.-one and one-vs.-rest, and that the prediction confidence of OOR is significantly higher than that of the one-vs.-one scheme. By the nature of its design, OOR can also directly detect a hidden (unknown) class: a "leave-one-class-out" experiment on the pendigits dataset shows that OOR outperforms one-vs.-one and one-vs.-rest in hidden-class detection rate.
Main Author: | Li-Jen Chien 簡立仁 |
---|---|
Other Authors: | Yuh-Jye Lee 李育杰 |
Format: | Others |
Language: | en_US |
Published: | 2010 |
Online Access: | http://ndltd.ncl.edu.tw/handle/81221681928172170409 |
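The record itself contains no code, but the CRSVM idea summarized in the abstract (per-class cluster centroids as the reduced set, with the Gaussian-kernel width derived from an estimate of cluster density) can be sketched as below. This is a minimal illustration, not the dissertation's implementation: the plain k-means routine, the use of mean within-cluster distance as a density proxy, and all function names are assumptions.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means; returns centroids and final point-to-cluster labels."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
        lab = d.argmin(axis=1)
        for j in range(k):
            if (lab == j).any():
                C[j] = X[lab == j].mean(axis=0)
    # final assignment consistent with the returned centroids
    lab = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2).argmin(axis=1)
    return C, lab

def crsvm_reduced_set(X, y, clusters_per_class=2):
    """Build a reduced set from per-class cluster centroids and estimate a
    Gaussian-kernel width from cluster spread (a crude density proxy)."""
    centroids, spreads = [], []
    for c in np.unique(y):
        Xc = X[y == c]
        C, lab = kmeans(Xc, min(clusters_per_class, len(Xc)))
        centroids.append(C)
        for j in range(len(C)):
            pts = Xc[lab == j]
            if len(pts):
                spreads.append(np.linalg.norm(pts - C[j], axis=1).mean())
    R = np.vstack(centroids)                 # the reduced set
    width = np.mean(spreads) + 1e-12
    gamma = 1.0 / (2.0 * width ** 2)         # Gaussian-kernel width parameter
    return R, gamma

def gaussian_kernel(X, R, gamma):
    """Rectangular kernel K(X, R) used in place of the full square kernel."""
    d2 = ((X[:, None, :] - R[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)
```

The reduced set `R` replaces the full training set inside the kernel, so the kernel matrix shrinks from n-by-n to n-by-|R|, which is the computational point of RSVM-style methods.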
id |
ndltd-TW-098NTUS5392079 |
record_format |
oai_dc |
spelling |
ndltd-TW-098NTUS53920792016-04-22T04:23:48Z http://ndltd.ncl.edu.tw/handle/81221681928172170409 Robust Smooth Support Vector Machine Learning 穩健平滑支撐向量機學習機制之研究 Li-Jen Chien 簡立仁 PhD === National Taiwan University of Science and Technology === Department of Computer Science and Information Engineering === 98
Yuh-Jye Lee 李育杰 2010 學位論文 ; thesis 90 en_US |
collection |
NDLTD |
language |
en_US |
format |
Others
|
sources |
NDLTD |
description |
PhD === National Taiwan University of Science and Technology === Department of Computer Science and Information Engineering === 98 === This dissertation proposes four robust smooth support vector machine (SSVM) learning methodologies. First, we propose a new approach to generating a representative reduced set for the reduced support vector machine (RSVM): the clustering reduced support vector machine (CRSVM) generates cluster centroids for each class and uses them to form the reduced set. By estimating the approximate density of each cluster, we can compute the width parameter of the Gaussian kernel. Second, we modify the earlier 2-norm soft-margin smooth support vector machine (SSVM2) to obtain a new 1-norm soft-margin smooth support vector machine (SSVM1). We also propose a heuristic outlier-filtering method for SSVMs that adds little cost to the training process and substantially improves resistance to outliers. Third, we introduce the smoothing technique into the 1-norm SVM, yielding smooth LASSO for classification (SLASSO), which performs classification and feature selection simultaneously. Results show that SLASSO is slightly more accurate than competing approaches while retaining the desirable ability to suppress irrelevant features. Finally, we implement a ternary SSVM (TSSVM) and use it to design a novel multiclass classification scheme, one-vs.-one-vs.-rest (OOR), which decomposes a k-class problem into a series of k(k-1)/2 ternary classification subproblems. Results show that TSSVM/OOR outperforms both one-vs.-one and one-vs.-rest, and that the prediction confidence of OOR is significantly higher than that of the one-vs.-one scheme. By the nature of its design, OOR can also directly detect a hidden (unknown) class: a "leave-one-class-out" experiment on the pendigits dataset shows that OOR outperforms one-vs.-one and one-vs.-rest in hidden-class detection rate.
|
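As a rough illustration of the OOR scheme described in the abstract, the sketch below builds the k(k-1)/2 ternary subproblems (+1 for class i, -1 for class j, 0 for the rest) and flags samples that win no votes as a hidden class. The ternary nearest-centroid learner merely stands in for the dissertation's TSSVM; it, the voting threshold, and all names are illustrative assumptions, not the author's method.

```python
import numpy as np
from itertools import combinations

class TernaryNearestCentroid:
    """Stand-in for TSSVM: separates class i (+1), class j (-1), rest (0)
    by nearest centroid of each of the three groups."""
    def fit(self, X, t):
        self.labels_ = np.array([1, -1, 0])
        self.C_ = np.vstack([X[t == l].mean(axis=0) for l in self.labels_])
        return self
    def predict(self, X):
        d = ((X[:, None, :] - self.C_[None, :, :]) ** 2).sum(axis=2)
        return self.labels_[d.argmin(axis=1)]

def oor_fit(X, y):
    """Train one ternary subproblem per class pair: k(k-1)/2 models."""
    models = {}
    for i, j in combinations(np.unique(y), 2):
        t = np.where(y == i, 1, np.where(y == j, -1, 0))
        models[(i, j)] = TernaryNearestCentroid().fit(X, t)
    return models

def oor_predict(models, X, classes, min_votes=1):
    """Vote over subproblems; a 'rest' output casts no vote, so a sample
    that every subproblem assigns to 'rest' is flagged as hidden (-1)."""
    votes = {c: np.zeros(len(X), dtype=int) for c in classes}
    for (i, j), m in models.items():
        p = m.predict(X)
        votes[i] += (p == 1)
        votes[j] += (p == -1)
    V = np.vstack([votes[c] for c in classes])
    out = np.array(classes)[V.argmax(axis=0)]
    out[V.max(axis=0) < min_votes] = -1   # hidden (unknown) class
    return out
```

Because the "rest" label absorbs samples that belong to neither class of a pair, a point far from every known class can receive zero votes overall, which is the mechanism the abstract exploits for hidden-class detection.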
author2 |
Yuh-Jye Lee |
author_facet |
Yuh-Jye Lee Li-Jen Chien 簡立仁 |
author |
Li-Jen Chien 簡立仁 |
spellingShingle |
Li-Jen Chien 簡立仁 Robust Smooth Support Vector Machine Learning |
author_sort |
Li-Jen Chien |
title |
Robust Smooth Support Vector Machine Learning |
title_short |
Robust Smooth Support Vector Machine Learning |
title_full |
Robust Smooth Support Vector Machine Learning |
title_fullStr |
Robust Smooth Support Vector Machine Learning |
title_full_unstemmed |
Robust Smooth Support Vector Machine Learning |
title_sort |
robust smooth support vector machine learning |
publishDate |
2010 |
url |
http://ndltd.ncl.edu.tw/handle/81221681928172170409 |
work_keys_str_mv |
AT lijenchien robustsmoothsupportvectormachinelearning AT jiǎnlìrén robustsmoothsupportvectormachinelearning AT lijenchien wěnjiànpínghuázhīchēngxiàngliàngjīxuéxíjīzhìzhīyánjiū AT jiǎnlìrén wěnjiànpínghuázhīchēngxiàngliàngjīxuéxíjīzhìzhīyánjiū |
_version_ |
1718231246342979584 |