Parameter learning and support vector reduction in support vector regression
Master's thesis === National Sun Yat-sen University === Department of Electrical Engineering === Academic year 94 === The selection and learning of kernel functions is an important but rarely studied problem in support vector learning, yet the kernel function of a support vector regression model strongly influences its performance. The kernel function projec...
Main Authors: | Chih-cheng Yang (楊智程) |
---|---|
Other Authors: | Shie-jue Lee (李錫智) |
Format: | Others |
Language: | zh-TW |
Published: | 2006 |
Online Access: | http://ndltd.ncl.edu.tw/handle/11288181706062580666 |
id | ndltd-TW-094NSYS5442088
record_format | oai_dc
spelling | ndltd-TW-094NSYS54420882016-05-27T04:18:10Z http://ndltd.ncl.edu.tw/handle/11288181706062580666 Parameter learning and support vector reduction in support vector regression 支援向量迴歸方法中的參數學習與支援向量點的縮減 Chih-cheng Yang 楊智程 Master's thesis, National Sun Yat-sen University, Department of Electrical Engineering, academic year 94. Advisor: Shie-jue Lee 李錫智. 2006. 學位論文 ; thesis. 72. zh-TW
collection | NDLTD
language | zh-TW
format | Others
sources | NDLTD
description |
Master's thesis === National Sun Yat-sen University === Department of Electrical Engineering === Academic year 94 === The selection and learning of kernel functions is an important but rarely studied problem in support vector learning, even though the kernel function of a support vector regression (SVR) model strongly influences its performance. The kernel function projects the dataset from the original data space into a feature space, so problems that cannot be solved in the low-dimensional input space may become solvable in the higher-dimensional feature space reached through the kernel transform.
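The projection into feature space is implicit: a kernel evaluates feature-space inner products without ever computing the high-dimensional images. A minimal sketch using the Gaussian (RBF) kernel, one common choice (the abstract does not name the specific kernels the thesis uses):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2).

    K[i, j] equals the inner product of the (implicit) feature-space
    images of X[i] and Y[j]; the projection itself is never computed.
    """
    # Squared distances via the expansion ||x||^2 - 2 x.y + ||y||^2.
    sq = (np.sum(X ** 2, axis=1)[:, None]
          - 2.0 * X @ Y.T
          + np.sum(Y ** 2, axis=1)[None, :])
    return np.exp(-gamma * sq)

X = np.array([[0.0], [1.0], [2.0]])
K = rbf_kernel(X, X, gamma=0.5)
# K is symmetric with ones on the diagonal; off-diagonal entries
# decay with squared distance, e.g. K[0, 1] = exp(-0.5).
```

The parameter `gamma` controls the kernel width and is exactly the kind of shape parameter that the learning procedure below tunes.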
This thesis makes two main contributions. First, we introduce the gradient descent method to the learning of kernel functions: we derive learning rules for the parameters that determine the shape and distribution of the kernel functions, so better kernels can be obtained by training these parameters under the risk minimization principle. Second, to reduce the number of support vectors, we use the orthogonal least squares method: by choosing representative support vectors, the less important ones can be removed from the support vector regression model.
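The thesis derives analytic update rules for the kernel parameters; as a stand-in sketch, the loop below descends a numerically estimated gradient of a validation risk with respect to an RBF width `gamma`, using kernel ridge regression as a simple surrogate for SVR (both are kernel regressors). The data, learning rate, and backtracking schedule are all illustrative assumptions, not the thesis's procedure:

```python
import numpy as np

def rbf(X, Y, gamma):
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def risk(gamma, Xtr, ytr, Xva, yva, lam=1e-3):
    # Fit a kernel ridge regressor and return validation MSE, which
    # plays the role of the (empirical) risk being minimized.
    K = rbf(Xtr, Xtr, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(Xtr)), ytr)
    pred = rbf(Xva, Xtr, gamma) @ alpha
    return np.mean((pred - yva) ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(80, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(80)
Xtr, ytr, Xva, yva = X[:60], y[:60], X[60:], y[60:]

gamma, eps = 0.05, 1e-4            # deliberately poor initial width
risk_before = risk(gamma, Xtr, ytr, Xva, yva)
for _ in range(25):
    # Central-difference gradient of the risk w.r.t. the kernel parameter.
    g = (risk(gamma + eps, Xtr, ytr, Xva, yva)
         - risk(gamma - eps, Xtr, ytr, Xva, yva)) / (2.0 * eps)
    step = 0.5
    while step > 1e-8:             # backtrack so the risk never increases
        cand = max(1e-4, gamma - step * g)
        if risk(cand, Xtr, ytr, Xva, yva) < risk(gamma, Xtr, ytr, Xva, yva):
            gamma = cand
            break
        step *= 0.5
risk_after = risk(gamma, Xtr, ytr, Xva, yva)
```

By construction the backtracking line search only accepts parameter updates that lower the risk, mirroring the goal of training kernel parameters under risk minimization.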
Experimental results show that our approach derives better kernel functions than other methods and achieves better generalization ability. The number of support vectors is also effectively reduced.
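The orthogonal least squares reduction can be sketched as greedy forward selection over kernel columns, in the style of the classical Chen–Cowan–Grant OLS algorithm; the toy matrix below stands in for the kernel columns k(·, x_j) of candidate support vectors and is an illustrative assumption, not the thesis's exact setup:

```python
import numpy as np

def ols_select(Phi, y, n_select):
    """Greedily pick the n_select columns of Phi that explain the most
    residual energy of y, orthogonalising each candidate against the
    already-chosen columns (Gram-Schmidt). The selected columns are
    the 'representative' ones; the rest can be dropped."""
    n, m = Phi.shape
    selected, basis = [], []
    residual = y.astype(float).copy()
    for _ in range(n_select):
        best_j, best_gain, best_q = -1, -1.0, None
        for j in range(m):
            if j in selected:
                continue
            q = Phi[:, j].astype(float).copy()
            for b in basis:            # orthogonalise against chosen basis
                q -= (b @ q) * b
            norm = np.linalg.norm(q)
            if norm < 1e-10:           # linearly dependent column, skip
                continue
            q /= norm
            gain = (q @ residual) ** 2  # residual energy this column explains
            if gain > best_gain:
                best_j, best_gain, best_q = j, gain, q
        selected.append(best_j)
        basis.append(best_q)
        residual -= (best_q @ residual) * best_q
    return selected

# Toy check: y is built from columns 2 and 5 only, so OLS should
# recover exactly those two columns as the representative set.
rng = np.random.default_rng(1)
Phi = rng.standard_normal((50, 10))
y = 3.0 * Phi[:, 2] - 2.0 * Phi[:, 5]
chosen = ols_select(Phi, y, 2)
```

In the SVR setting, keeping only the selected columns corresponds to retaining the representative support vectors and discarding the less important ones.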
|
author2 | Shie-jue Lee
author_facet | Shie-jue Lee Chih-cheng Yang 楊智程
author | Chih-cheng Yang 楊智程
spellingShingle | Chih-cheng Yang 楊智程 Parameter learning and support vector reduction in support vector regression
author_sort | Chih-cheng Yang
title | Parameter learning and support vector reduction in support vector regression
title_short | Parameter learning and support vector reduction in support vector regression
title_full | Parameter learning and support vector reduction in support vector regression
title_fullStr | Parameter learning and support vector reduction in support vector regression
title_full_unstemmed | Parameter learning and support vector reduction in support vector regression
title_sort | parameter learning and support vector reduction in support vector regression
publishDate | 2006
url | http://ndltd.ncl.edu.tw/handle/11288181706062580666
work_keys_str_mv | AT chihchengyang parameterlearningandsupportvectorreductioninsupportvectorregression AT yángzhìchéng parameterlearningandsupportvectorreductioninsupportvectorregression AT chihchengyang zhīyuánxiàngliànghuíguīfāngfǎzhōngdecānshùxuéxíyǔzhīyuánxiàngliàngdiǎndesuōjiǎn AT yángzhìchéng zhīyuánxiàngliànghuíguīfāngfǎzhōngdecānshùxuéxíyǔzhīyuánxiàngliàngdiǎndesuōjiǎn
_version_ | 1718282157507477504