Neural Network Training Using Simplified Hybrid Nelder-Mead and Particle Swarm Optimization

Bibliographic Details
Main Authors: Liao, Shih-Hui, 廖時慧
Other Authors: Chang, Jyh-Yeong
Format: Others
Language: en_US
Published: 2014
Online Access: http://ndltd.ncl.edu.tw/handle/88226115336437697990
Description
Summary: Doctoral dissertation === National Chiao Tung University === Institute of Electrical and Control Engineering === 103 === Neural network training using a simplified hybrid of the Nelder-Mead (NM) simplex method and particle swarm optimization (PSO) is investigated in this dissertation, which consists of three parts.

In the first part, a new, simplified hybrid algorithm combining the Nelder-Mead simplex method with the particle swarm optimization algorithm, abbreviated SNM-PSO, is proposed for training artificial neural networks (ANNs). The proposed method is simpler than other similar hybrid PSO methods and places more emphasis on exploration of the search space. Simulation results show that the proposed method outperforms other similar hybrid methods.

In the second part of the dissertation, we propose the adaptive least trimmed squares fuzzy neural networks (ALTS-FNNs), which generalize the linear least trimmed squares (LTS) estimators, to deal with data sets contaminated by outliers. In our method, the key parameter, namely the trimming percentage (trimming constant), is determined automatically from the data. The incremental gradient descent, PSO, and SNM-PSO algorithms are used to train the ALTS-FNNs on several numerical examples, and performance comparisons are made. The proposed ALTS-FNNs are usually more robust against outliers than conventional LTS-FNNs with arbitrarily specified trimming percentages.

In the last part of the dissertation, we propose a new class of learning models, the additive artificial neural networks (AANNs), for general nonlinear regression problems. This class of learning machines combines artificial neural networks with the additive models (AMs) frequently encountered in semiparametric regression. Specifically, the AMs are embedded in the output layer of the ANNs to form the AANNs. The proposed SNM-PSO method is used to estimate the parameters of the first layer of the AANNs. Several numerical examples compare the performance of conventional ANNs and the proposed AANNs; in all these simulations, the AANNs outperform conventional ANNs with comparable numbers of parameters.
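
As a rough illustration of the optimization ideas summarized above, the following Python sketch trains a tiny one-hidden-layer network with a generic global-best PSO loop whose best solution is periodically refined by a few Nelder-Mead simplex iterations (here via scipy.optimize.minimize), and it uses a fixed-fraction trimmed squared-error loss in the spirit of LTS. The network size, PSO constants, trimming fraction `keep=0.8`, and the use of scipy's Nelder-Mead are all illustrative assumptions; this is not the dissertation's SNM-PSO, its simplification strategy, its data-driven choice of the trimming constant, or the ALTS-FNN/AANN models.

```python
# Minimal, generic sketch of hybrid Nelder-Mead / PSO training of a small
# neural network, loosely following the ideas in the abstract. All constants
# and design choices below are illustrative assumptions, not the author's method.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Tiny one-hidden-layer regression network, parameters packed in one vector.
N_IN, N_HID = 1, 8
N_PARAMS = N_HID * (N_IN + 1) + (N_HID + 1)   # hidden W+b, output w+b

def forward(theta, X):
    W1 = theta[:N_HID * N_IN].reshape(N_HID, N_IN)
    b1 = theta[N_HID * N_IN:N_HID * (N_IN + 1)]
    w2 = theta[N_HID * (N_IN + 1):-1]
    b2 = theta[-1]
    h = np.tanh(X @ W1.T + b1)                # hidden layer
    return h @ w2 + b2                        # linear output

def lts_loss(theta, X, y, keep=0.8):
    # Least-trimmed-squares style loss: average only the smallest `keep`
    # fraction of squared residuals, so large outlier residuals are ignored.
    # (The adaptive, data-driven choice of the trimming constant described
    # in the abstract is not reproduced here.)
    r2 = np.sort((forward(theta, X) - y) ** 2)
    h = max(1, int(keep * r2.size))
    return np.mean(r2[:h])

def hybrid_nm_pso(loss, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5,
                  nm_every=10, nm_steps=50):
    """Standard global-best PSO; every `nm_every` iterations the global best
    is polished with a few Nelder-Mead simplex iterations (via scipy)."""
    pos = rng.normal(scale=0.5, size=(n_particles, N_PARAMS))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([loss(p) for p in pos])
    g = pbest[np.argmin(pbest_f)].copy()
    g_f = pbest_f.min()
    for t in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        f = np.array([loss(p) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        if pbest_f.min() < g_f:
            g, g_f = pbest[np.argmin(pbest_f)].copy(), pbest_f.min()
        if (t + 1) % nm_every == 0:           # local simplex refinement
            res = minimize(loss, g, method='Nelder-Mead',
                           options={'maxiter': nm_steps})
            if res.fun < g_f:
                g, g_f = res.x, res.fun
    return g, g_f

# Toy regression data with a few injected outliers.
X = np.linspace(-3, 3, 80).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.normal(size=80)
y[::20] += 3.0                                # outliers

theta, f = hybrid_nm_pso(lambda th: lts_loss(th, X, y))
print("trimmed training loss:", f)
```

Periodically polishing the swarm's global best with a simplex search is one common way such NM-PSO hybrids are organized; the dissertation's simplified variant, which reportedly places more emphasis on exploration, may arrange the two components differently.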