Study on Least Trimmed Squares Artificial Neural Networks

Master's thesis === National Sun Yat-sen University === Department of Electrical Engineering === 96 === In this thesis, we study least trimmed squares artificial neural networks (LTS-ANNs), which generalize the least trimmed squares (LTS) estimators frequently used in robust linear parametric regression to the nonparametric artificial neural networks (ANNs) used for nonlinear regression problems. Two training algorithms are proposed. The first is an incremental gradient descent algorithm. To speed up convergence, the second is based on recursive least squares (RLS). Three illustrative examples test the robustness against outliers of the classical ANNs and the LTS-ANNs. Simulation results show that, with proper selection of the trimming constant of the learning machines, LTS-ANNs are quite robust against outliers compared with the classical ANNs.
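The LTS idea the abstract describes is to sort the squared residuals at each step and update the network only on the h smallest of them, so gross outliers never drive the gradient. The following is a minimal sketch of that criterion with a one-hidden-layer tanh network and per-sample gradient updates; the network size, step size, and trimming constant h are illustrative assumptions, not the thesis's actual settings or its RLS variant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data with a few injected gross outliers.
n = 60
x = np.linspace(-1, 1, n)
y = np.sin(np.pi * x)
outliers = np.array([5, 20, 45])
y[outliers] += 5.0

# One-hidden-layer tanh network: f(x) = w2 . tanh(w1*x + b1) + b2
n_hidden = 8
w1 = rng.normal(size=n_hidden)
b1 = rng.normal(size=n_hidden)
w2 = rng.normal(size=n_hidden) * 0.1
b2 = 0.0

h = 50      # trimming constant: train on the h smallest squared residuals
lr = 0.05   # step size (illustrative)

for epoch in range(2000):
    hid = np.tanh(np.outer(x, w1) + b1)      # (n, n_hidden)
    r2 = (hid @ w2 + b2 - y) ** 2
    keep = np.argsort(r2)[:h]                # indices of h smallest residuals
    for i in keep:                           # incremental (per-sample) updates
        a = np.tanh(w1 * x[i] + b1)
        e = (a @ w2 + b2) - y[i]
        g_hid = e * w2 * (1 - a ** 2)        # backprop through tanh
        w2 -= lr * e * a
        b2 -= lr * e
        w1 -= lr * g_hid * x[i]
        b1 -= lr * g_hid

# The trimmed fit should track sin(pi*x) on the clean points,
# largely ignoring the three injected outliers.
clean = np.setdiff1d(np.arange(n), outliers)
hid = np.tanh(np.outer(x, w1) + b1)
mse_clean = np.mean((hid @ w2 + b2 - y)[clean] ** 2)
```

Because the initial predictions are near zero, the three outliers (residuals around 5) already fall outside the h smallest squared residuals on the first sort, so they are trimmed away from the start; an ordinary least-squares ANN would instead bend its fit toward them.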


Bibliographic Details
Main Authors: Wen-Chin Cheng, 鄭文欽
Other Authors: Jer-Guang Hsieh
Format: Others
Language: en_US
Published: 2008
Online Access: http://ndltd.ncl.edu.tw/handle/uku536
Title in Chinese: 最小截尾平方類神經網路之研究
Extent: 54 pages