Convergence of a Relaxed Variable Splitting Coarse Gradient Descent Method for Learning Sparse Weight Binarized Activation Neural Network
Sparsification of neural networks is an effective complexity-reduction method for improving efficiency and generalizability. Binarized activation offers additional computational savings at inference time. Due to the vanishing gradient issue in training networks with binarized activation, coarse grad...
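The abstract refers to the vanishing gradient of binarized activations and to a coarse gradient used in their place. As a minimal illustrative sketch (not the article's method; all names here are hypothetical), a hard step activation has zero derivative almost everywhere, and a common "straight-through" style remedy is to substitute the derivative of a clipped identity in the backward pass:

```python
def binarized_activation(xs):
    # Forward pass: hard step activation. Its true derivative is zero
    # almost everywhere, which stalls ordinary backpropagation.
    return [1.0 if x > 0 else 0.0 for x in xs]

def coarse_grad(xs):
    # Coarse (straight-through style) backward pass: substitute the
    # derivative of a clipped identity, so a nonzero surrogate
    # "gradient" flows through the binarization during training.
    return [1.0 if abs(x) <= 1.0 else 0.0 for x in xs]

pre_activations = [-2.0, -0.5, 0.3, 1.7]
print(binarized_activation(pre_activations))  # [0.0, 0.0, 1.0, 1.0]
print(coarse_grad(pre_activations))           # [0.0, 1.0, 1.0, 0.0]
```

The specific surrogate (here a clipped identity) is an assumption for illustration; the article analyzes convergence of descent with such coarse gradients combined with a relaxed variable splitting scheme for sparsity.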
Main Authors: Thu Dinh, Jack Xin
Format: Article
Language: English
Published: Frontiers Media S.A., 2020-05-01
Series: Frontiers in Applied Mathematics and Statistics
Subjects:
Online Access: https://www.frontiersin.org/article/10.3389/fams.2020.00013/full
Similar Items
- Two convergence results for continuous descent methods, by: Simeon Reich, et al. Published: (2003-03-01)
- Linear convergence of the relaxed gradient projection algorithm for solving the split equality problems in Hilbert spaces, by: Tingting Tian, et al. Published: (2019-03-01)
- Convergence results for a class of abstract continuous descent methods, by: Sergiu Aizicovici, et al. Published: (2004-03-01)
- Determination of accelerated factors in gradient descent iterations based on Taylor's series, by: Petrović Milena, et al. Published: (2017-01-01)
- Bayesian Identification of Dynamical Systems, by: Robert K. Niven, et al. Published: (2020-02-01)