The Supervised Learning Algorithm of Artificial Neural Networks and Its Applications

Ph.D. === National Chung Hsing University === Department of Applied Mathematics === 87 === Artificial neural networks (ANNs) are also referred to as neural networks, connectionism, adaptive networks, artificial neural systems, neurocomputers, and parallel distributed processors. ANNs are mathematical models of the theorized mind and exploit the massively parallel...


Bibliographic Details
Main Authors: Hsieh Chih-Ming, 謝志明
Other Authors: Chu Yen-Ping
Format: Others
Language: zh-TW
Published: 1999
Online Access:http://ndltd.ncl.edu.tw/handle/81533040658096524217
id ndltd-TW-087NCHU0507034
record_format oai_dc
spelling ndltd-TW-087NCHU05070342015-10-13T17:54:32Z http://ndltd.ncl.edu.tw/handle/81533040658096524217 The Supervised Learning Algorithm of Artificial Neural Networks and Its Applications 類神經網路監督式演算模式及其應用 Hsieh Chih-Ming 謝志明 Ph.D. National Chung Hsing University Department of Applied Mathematics 87 Artificial neural networks (ANNs) are also referred to as neural networks, connectionism, adaptive networks, artificial neural systems, neurocomputers, and parallel distributed processors. ANNs are mathematical models of the theorized mind that exploit the massively parallel local processing and distributed-representation properties believed to exist in the brain. The primary intent of ANNs is to explore and reproduce human information-processing tasks such as speech, vision, touch, and knowledge processing. In addition, ANNs are used for data compression, near-optimal solutions to combinatorial optimization problems, pattern matching, system modeling, and function approximation. ANN theory draws on many disciplines, including psychology, mathematics, neuroscience, physics, engineering, computer science, philosophy, biology, and linguistics. It is evident from this diverse listing that ANN technology represents a "universalization" among the sciences working toward a common goal: building an intelligent system. It is equally evident that an accurate and complete description of the work in all the listed disciplines is an impossible task. In light of this, we focus on ANN paradigms that have applications or application potential. This thesis proposes an improved supervised binary learning model for ANNs, together with several applications. Learning algorithms are divided into supervised and unsupervised types according to the learning method, and training data can be divided into binary and continuous data. The perceptron is the earliest practical supervised binary learning model, but it can solve only linearly separable problems. Linearly nonseparable problems have been addressed by the backpropagation learning algorithm using hidden units; although convergence results for backpropagation have been proved, learning may still fail. We therefore propose an improved learning structure and algorithm, based on functional-link units, that learns successfully. However, the algorithm requires much more storage for the functional-link units, resulting in much more computation time. It is well known that if a given set of binary training patterns is linearly separable, then successful learning can be achieved using only perceptrons, without any functional-link unit. Consequently, to reduce the number of functional-link cells required for successful learning, it is crucial to provide an easy method for checking the linear separability of each set of binary patterns in the input training examples. A method for checking the linear separability of a given set of binary training patterns in supervised binary learning of an artificial neural network is therefore also proposed in this thesis. If the training patterns are linearly separable, the mapping can be implemented by a perceptron alone. Applications of the improved learning algorithm are presented in later chapters. Chu Yen-Ping 朱延平 1999 Degree thesis ; 96 zh-TW
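The abstract's claims about the perceptron can be illustrated with a minimal sketch. This is generic textbook code, not the thesis's own algorithm: by the perceptron convergence theorem, the learning rule below terminates on linearly separable patterns (e.g. AND) and never converges on nonseparable ones (e.g. XOR), which is also the intuition behind treating training success as a linear-separability check. The function name, learning rate, and epoch cap are illustrative assumptions.

```python
# Classic perceptron learning rule on binary patterns (illustrative sketch).
# Training terminates with zero errors iff the patterns are linearly
# separable; a hit epoch cap signals (likely) nonseparability.

def train_perceptron(patterns, targets, lr=1.0, max_epochs=100):
    """Return (weights, bias, converged) for binary targets in {0, 1}."""
    n = len(patterns[0])
    w, b = [0.0] * n, 0.0
    for _ in range(max_epochs):
        errors = 0
        for x, t in zip(patterns, targets):
            # Threshold unit: output 1 iff the weighted sum exceeds 0.
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            if y != t:  # misclassified: nudge the separating hyperplane
                errors += 1
                for i in range(n):
                    w[i] += lr * (t - y) * x[i]
                b += lr * (t - y)
        if errors == 0:  # a full error-free pass: converged
            return w, b, True
    return w, b, False  # cap reached: patterns likely nonseparable

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
_, _, ok_and = train_perceptron(X, [0, 0, 0, 1])  # AND: linearly separable
_, _, ok_xor = train_perceptron(X, [0, 1, 1, 0])  # XOR: not separable
```

On the AND targets the rule converges within a few epochs; on XOR it exhausts the epoch cap, which is exactly the limitation that motivates hidden units or functional-link units in the abstract.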
collection NDLTD
language zh-TW
format Others
sources NDLTD
author2 Chu Yen-Ping
author_facet Chu Yen-Ping
Hsieh Chih-Ming
謝志明
author Hsieh Chih-Ming
謝志明
spellingShingle Hsieh Chih-Ming
謝志明
The Supervised Learning Algorithm of Artificial Neural Networks and Its Applications
author_sort Hsieh Chih-Ming
title The Supervised Learning Algorithm of Artificial Neural Networks and Its Applications
title_short The Supervised Learning Algorithm of Artificial Neural Networks and Its Applications
title_full The Supervised Learning Algorithm of Artificial Neural Networks and Its Applications
title_fullStr The Supervised Learning Algorithm of Artificial Neural Networks and Its Applications
title_full_unstemmed The Supervised Learning Algorithm of Artificial Neural Networks and Its Applications
title_sort supervised learning algorithm of artificial neural networks and its applications
publishDate 1999
url http://ndltd.ncl.edu.tw/handle/81533040658096524217
work_keys_str_mv AT hsiehchihming thesupervisedlearningalgorithmofartificialneuralnetworksanditsapplications
AT xièzhìmíng thesupervisedlearningalgorithmofartificialneuralnetworksanditsapplications
AT hsiehchihming lèishénjīngwǎnglùjiāndūshìyǎnsuànmóshìjíqíyīngyòng
AT xièzhìmíng lèishénjīngwǎnglùjiāndūshìyǎnsuànmóshìjíqíyīngyòng
AT hsiehchihming supervisedlearningalgorithmofartificialneuralnetworksanditsapplications
AT xièzhìmíng supervisedlearningalgorithmofartificialneuralnetworksanditsapplications
_version_ 1717785328612278272