Accelerating the Training Process of Convolutional Neural Networks for Image Classification by Dropping Training Samples Out
Stochastic gradient descent and other adaptive optimization methods have proven effective for training deep neural networks. Within each epoch of these methods, the entire training set is used to train the model. In general, large training sets contain data redundancy among their training...
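The abstract is truncated before the method is described, but the stated idea is to exploit redundancy by dropping some training samples each epoch instead of iterating over the full set. The paper's actual selection criterion is not shown here; the sketch below assumes, purely for illustration, that the easiest half of the samples (those with the smallest current loss) are dropped each epoch while SGD runs on the remainder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data and a logistic-regression model.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
w = np.zeros(2)
b = 0.0
lr = 0.5

def per_sample_loss(w, b):
    """Cross-entropy loss of each training sample under the current model."""
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    return -(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

for epoch in range(20):
    # Hypothetical dropping rule: keep only the 50% hardest samples
    # (largest loss); the redundant, already well-fit half is skipped.
    losses = per_sample_loss(w, b)
    keep = np.argsort(losses)[len(losses) // 2:]

    # Plain SGD over the retained subset only.
    for i in rng.permutation(keep):
        p = 1.0 / (1.0 + np.exp(-(X[i] @ w + b)))
        g = p - y[i]  # gradient of cross-entropy w.r.t. the logit
        w -= lr * g * X[i]
        b -= lr * g

acc = (((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y).mean()
print(f"training accuracy: {acc:.3f}")
```

Since only half the samples are visited per epoch, each epoch costs roughly half as much gradient computation; the subset is re-selected every epoch, so no sample is discarded permanently.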
| Main Authors: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2020-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/9154363/ |