Exploring the efficiency of the combined application of connection pruning and source data pre-processing when training a multilayer perceptron

Bibliographic Details
Main Authors: Oleg Galchonkov, Alexander Nevrev, Maria Glava, Mykola Babych
Format: Article
Language: English
Published: PC Technology Center, 2020-04-01
Series: Eastern-European Journal of Enterprise Technologies
Subjects: multilayer perceptron; neural network; pruning; regularization; learning curve; weight coefficients
Online Access: http://journals.uran.ua/eejet/article/view/200819
DOI: 10.15587/1729-4061.2020.200819
ISSN: 1729-3774 (print); 1729-4061 (online)
Citation: Eastern-European Journal of Enterprise Technologies, Vol. 2, No. 9 (104), 2020, pp. 6–13
Author Affiliation: Odessa National Polytechnic University, Shevchenka ave., 1, Odessa, Ukraine, 65044
Collection: DOAJ
Description: A conventional scheme for operating neural networks has, until recently, been to assign a network architecture and then train it. However, recent research in this field has revealed that neural networks set up and configured in this way exhibit considerable redundancy. An additional operation has therefore been introduced: eliminating this redundancy by pruning the connections in the network architecture. Among the many approaches to eliminating redundancy, the most promising is the combined application of several methods, where their cumulative effect exceeds the sum of the effects of employing each of them separately.

We have performed an experimental study of the effectiveness of combining iterative pruning with pre-processing (pre-distortion) of the input data for the task of recognizing handwritten digits with a multilayer perceptron. It is shown that input data pre-processing regularizes the training procedure, thereby preventing overfitting. The combined application of iterative pruning and input data pre-processing yielded a smaller handwritten-digit recognition error, 1.22 %, than pruning alone (which reduced the error from 1.89 % to 1.81 %) or pre-distortion alone (which reduced the error from 1.89 % to 1.52 %). In addition, regularization via pre-distortion makes it possible to obtain a monotonically increasing number of pruned connections while keeping the error at 1.45 %.

The learning curves obtained for the same task, but with training started from different initial conditions, take different values both during training and at its end. This indicates the multi-extremal character of the quality function, the recognition accuracy. The practical implication of the study is our proposal to train a neural network multiple times and select the best result.
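The combined scheme the abstract describes can be illustrated with a minimal, hypothetical sketch: a multilayer perceptron trained on MNIST with random pre-distortions of the input images, interleaved with iterative magnitude pruning of connections. This is not the authors' code; the 784-300-100-10 layout, the 20 % per-cycle pruning fraction, and the distortion strengths are all assumptions, and torchvision's MNIST loader stands in for whatever handwritten-digit data the paper used.

```python
# Hypothetical sketch, not the paper's implementation: an MLP trained with
# input pre-distortion (regularization) plus iterative magnitude pruning.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Pre-distortion of the source data: small random shifts and rotations that
# regularize training, as the abstract reports. Magnitudes are guesses.
pre_distort = transforms.Compose([
    transforms.RandomAffine(degrees=10, translate=(0.08, 0.08)),
    transforms.ToTensor(),
])
train_set = datasets.MNIST("data", train=True, download=True, transform=pre_distort)
loader = DataLoader(train_set, batch_size=128, shuffle=True)

# A plain multilayer perceptron; 784-300-100-10 is an assumed layout.
mlp = nn.Sequential(
    nn.Flatten(), nn.Linear(784, 300), nn.ReLU(),
    nn.Linear(300, 100), nn.ReLU(), nn.Linear(100, 10),
)
opt = torch.optim.Adam(mlp.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Binary masks record which connections are still alive (1) or pruned (0).
masks = {m: torch.ones_like(m.weight) for m in mlp if isinstance(m, nn.Linear)}

def prune_smallest(fraction):
    """Disconnect the given fraction of the smallest surviving weights."""
    for m, mask in masks.items():
        surviving = m.weight.data[mask.bool()].abs()
        threshold = surviving.quantile(fraction)
        mask *= (m.weight.data.abs() > threshold).float()
        m.weight.data *= mask

for cycle in range(5):               # iterative pruning: train, prune, repeat
    for epoch in range(2):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(mlp(x), y).backward()
            opt.step()
            # keep previously pruned connections at zero after each update
            for m, mask in masks.items():
                m.weight.data *= mask
    prune_smallest(0.2)              # cut 20 % of surviving weights per cycle
```

Tracking `1 - mask.mean()` per layer after each cycle would give the share of disconnected connections, the quantity the abstract reports growing monotonically while the error holds at 1.45 %.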
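The closing observation, that learning curves started from different initial conditions end at different accuracies, motivates the paper's practical proposal of multiple training runs. A hedged sketch of that selection loop follows; `build_and_train` and `val_loader` are illustrative placeholders for any training routine (such as the one above) and a held-out digit set, and the number of restarts is an arbitrary choice.

```python
# Hypothetical illustration of the paper's proposal: since the quality
# function (recognition accuracy) is multi-extremal, train several networks
# from different random initial conditions and keep the best one.
import torch

def accuracy(model, loader):
    """Fraction of correctly recognized digits on a held-out set."""
    hits = total = 0
    with torch.no_grad():
        for x, y in loader:
            hits += (model(x).argmax(dim=1) == y).sum().item()
            total += y.numel()
    return hits / total

def multi_start(build_and_train, val_loader, restarts=5):
    """build_and_train is any routine that returns a freshly initialized,
    trained model; both names here are illustrative placeholders."""
    best_model, best_acc = None, 0.0
    for seed in range(restarts):
        torch.manual_seed(seed)      # a different starting point each run
        model = build_and_train()
        acc = accuracy(model, val_loader)
        if acc > best_acc:
            best_model, best_acc = model, acc
    return best_model, best_acc
```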