Optimal Training Parameters and Hidden Layer Neuron Number of Two-Layer Perceptron for Generalised Scaled Object Classification Problem


Bibliographic Details
Main Author: Romanuke Vadim
Format: Article
Language: English
Published: Sciendo 2015-12-01
Series: Information Technology and Management Science
Subjects:
Online Access: http://www.degruyter.com/view/j/itms.2015.18.issue-1/itms-2015-0007/itms-2015-0007.xml?format=INT
Description
Summary: The research is focused on optimising a two-layer perceptron for the generalised scaled object classification problem. The optimisation criterion is minimisation of inaccuracy, which depends on the training parameters and the hidden layer neuron number. After statistics on the inaccuracy are accumulated, the minimisation is carried out by a numerical search. The perceptron is additionally optimised by extra training. Once this is done, the classification error percentage does not exceed 3 % even under the worst scale distortion.
ISSN: 2255-9094
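
The summary above describes a grid-style procedure: accumulate statistics of the classification inaccuracy over combinations of training parameters and hidden layer neuron numbers, then locate the minimiser by a numerical search. Below is a minimal Python sketch of that kind of search using scikit-learn's MLPClassifier. The digits dataset, the parameter ranges, the choice of learning rate as the training parameter, and the single run per configuration are illustrative assumptions and not taken from the article, which accumulates inaccuracy statistics before minimising.

# Sketch only: grid search over hidden layer size and a training parameter,
# selecting the configuration with the lowest classification inaccuracy.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

best = None
for n_hidden in (20, 40, 60, 80):        # hidden layer neuron numbers to try (assumed range)
    for lr in (1e-3, 1e-2, 1e-1):        # a training parameter (learning rate, assumed range)
        clf = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                            learning_rate_init=lr,
                            max_iter=500, random_state=0)
        clf.fit(X_train, y_train)
        inaccuracy = 1.0 - clf.score(X_test, y_test)  # classification error rate
        if best is None or inaccuracy < best[0]:
            best = (inaccuracy, n_hidden, lr)

print("lowest inaccuracy %.4f at %d hidden neurons, lr=%g" % best)

Averaging the inaccuracy over several runs per configuration, as the statistics accumulation in the article suggests, would give a more robust minimiser than the single-run estimate used in this sketch.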