Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth
A new network with super-approximation power is introduced. This network is built with the Floor (⌊x⌋) or ReLU (max{0,x}) activation function in each neuron; hence, we call such networks Floor-ReLU networks. For any hyperparameters N∈ℕ+ and L∈ℕ+, we show that Floor-ReLU networks with width max{d, 5N+13}...
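The abstract describes networks in which each neuron applies either the floor function ⌊x⌋ or ReLU max{0, x} to its pre-activation. A minimal sketch of one such layer in NumPy, assuming an illustrative per-neuron mask that selects which activation each neuron uses (the mask, weights, and values below are invented for demonstration, not taken from the paper):

```python
import numpy as np

def floor_relu_layer(x, W, b, use_floor):
    """One hidden layer of a Floor-ReLU network: each neuron applies
    either floor or ReLU to its pre-activation, chosen by `use_floor`."""
    z = W @ x + b
    # Neurons flagged True use floor(z); the rest use ReLU(z) = max(z, 0).
    return np.where(use_floor, np.floor(z), np.maximum(z, 0.0))

# Tiny example: 2 inputs feeding 3 neurons; only the first uses floor.
x = np.array([1.7, -0.3])
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.zeros(3)
mask = np.array([True, False, False])
y = floor_relu_layer(x, W, b, mask)  # [floor(1.7), relu(-0.3), relu(1.4)]
```

Here `y` evaluates to `[1.0, 0.0, 1.4]`: the floor neuron discretizes its input while the ReLU neurons pass positive values through unchanged, which is the mix of activations the paper's construction exploits.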
Main Authors: Shen, Z.; Yang, H.; Zhang, S.
Format: Article
Language: English
Published: NLM (Medline), 2021
Online Access: View Fulltext in Publisher
Similar Items
- Approximating square roots
  by: Poirier Schmitz, Alfredo
  Published: (2014)
- Modified Fast Inverse Square Root and Square Root Approximation Algorithms: The Method of Switching Magic Constants
  by: Leonid V. Moroz, et al.
  Published: (2021-02-01)
- Design and Analysis of an IEEE Standard 754 Reciprocal Square Root Arithmetic Unit
  by: Chih-Pin Yang, et al.
  Published: (2000)
- Least squares approximations of power series
  by: James Guyker
  Published: (2006-01-01)
- Variable precision floating point reciprocal, division and square root for major FPGA vendors
  Published: ()