Shannon Entropy and Mean Square Errors for speeding the convergence of Multilayer Neural Networks: A comparative approach
Improving the efficiency and convergence rate of Multilayer Backpropagation Neural Network algorithms is an active area of research. Recent years have witnessed increasing attention to entropy-based criteria in adaptive systems, and several principles have been proposed based on the maximization or minimization of entropic cost functions. One way to apply entropy criteria in learning systems is to minimize the entropy of the error between two variables: typically, one is the output of the learning system and the other is the target. This paper proposes improving the efficiency and convergence rate of Multilayer Backpropagation (BP) Neural Networks by substituting the usual Mean Square Error (MSE) minimization principle with the minimization of the Shannon Entropy (SE) of the differences between the multilayer perceptron's output and the desired target. The two cost functions are studied, analyzed, and tested with two different activation functions, namely the Cauchy and the hyperbolic tangent activation functions. The comparison indicates that the degree of convergence using the Shannon Entropy cost function is higher than its counterpart using MSE, while MSE converges faster than Shannon Entropy.
Main Author: | Hussein Aly Kamel Rady |
---|---|
Format: | Article |
Language: | English |
Published: | Elsevier, 2011-11-01 |
Series: | Egyptian Informatics Journal, Vol. 12, No. 3, pp. 197-209 |
ISSN: | 1110-8665 |
DOI: | 10.1016/j.eij.2011.09.002 |
Subjects: | Shannon Entropy; Mean Square Error; Activation function; Learning rate; Backpropagation Neural Network |
Online Access: | http://www.sciencedirect.com/science/article/pii/S1110866511000399 |
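To make the abstract's comparison concrete, below is a minimal sketch (not taken from the paper) of the two cost functions applied to an error vector e_i = output_i - target_i: the usual MSE, J_MSE = (1/N) * Σ e_i^2, and a plug-in Shannon entropy estimate, H_hat = -(1/N) * Σ log p_hat(e_i), where p_hat is a Gaussian Parzen-window density estimate of the errors. The Parzen-window estimator and the Cauchy-CDF form of the Cauchy activation shown here are assumptions; the paper's exact estimator and activation definitions may differ.

```python
import numpy as np

# Hedged sketch: contrasts the two cost functions from the abstract on a
# synthetic error vector e = output - target. The Gaussian Parzen-window
# entropy estimator used here is a common choice in entropy-based training;
# the paper's exact estimator may differ.

def mse(errors):
    """Mean Square Error: (1/N) * sum(e_i ** 2)."""
    return np.mean(errors ** 2)

def shannon_entropy(errors, sigma=0.1):
    """Plug-in (resubstitution) estimate of the Shannon entropy of the errors.

    Density at each error point via a Gaussian Parzen window of width sigma:
        p_hat(e_i) = (1/N) * sum_j N(e_i - e_j; 0, sigma^2)
    Entropy estimate:
        H_hat = -(1/N) * sum_i log p_hat(e_i)
    """
    diffs = errors[:, None] - errors[None, :]        # pairwise differences e_i - e_j
    kernel = np.exp(-(diffs ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    p_hat = kernel.mean(axis=1)                      # estimated density at each e_i
    return -np.mean(np.log(p_hat))

# The two activation functions named in the abstract. The Cauchy form below is
# the Cauchy CDF, 0.5 + arctan(x) / pi; this exact form is an assumption.
def cauchy(x):
    return 0.5 + np.arctan(x) / np.pi

def tanh(x):
    return np.tanh(x)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    errors = rng.normal(loc=0.0, scale=0.2, size=200)   # stand-in output-target errors
    print(f"MSE:             {mse(errors):.4f}")
    print(f"Shannon entropy: {shannon_entropy(errors):.4f}")
```

One property worth noting when swapping in an entropy cost: entropy is invariant to a constant shift of the errors, so error-entropy training schemes generally re-center the errors (e.g. by adjusting an output bias) so that low entropy also means errors concentrated near zero. This is a known property of error-entropy minimization in general, not a claim specific to this paper.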