Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth

A new network with super-approximation power is introduced. This network is built with either the Floor (⌊x⌋) or the ReLU (max{0,x}) activation function in each neuron; hence, we call such networks Floor-ReLU networks. For any hyperparameters N ∈ ℕ+ and L ∈ ℕ+, we show that Floor-ReLU networks with width max{d, 5N+13}...
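To make the architecture named in the abstract concrete, below is a minimal illustrative sketch (not the authors' construction) of a feed-forward network in which every neuron applies either the floor function or ReLU; all widths, depths, weights, and the per-neuron activation assignments are arbitrary placeholders chosen for the example.

```python
import numpy as np

def floor_relu_layer(x, W, b, use_floor):
    """Affine map followed by a per-neuron activation:
    floor(z) where use_floor is True, max(0, z) otherwise."""
    z = x @ W + b
    return np.where(use_floor, np.floor(z), np.maximum(z, 0.0))

def floor_relu_network(x, params):
    """Apply a stack of Floor-ReLU layers; `params` is a list of
    (W, b, use_floor) triples, one per hidden layer."""
    h = x
    for W, b, use_floor in params:
        h = floor_relu_layer(h, W, b, use_floor)
    return h

# Example: input dimension d = 2, two hidden layers of width 4, scalar output.
rng = np.random.default_rng(0)
d, width = 2, 4
params = [
    (rng.normal(size=(d, width)), rng.normal(size=width),
     np.array([True, False, True, False])),   # mix of floor and ReLU neurons
    (rng.normal(size=(width, width)), rng.normal(size=width),
     np.array([False, True, False, True])),
]
W_out, b_out = rng.normal(size=(width, 1)), rng.normal(size=1)
x = rng.uniform(size=(3, d))                  # three sample points in [0,1]^d
y = floor_relu_network(x, params) @ W_out + b_out
print(y.shape)  # (3, 1)
```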


Bibliographic Details
Main Authors: Shen, Z. (Author), Yang, H. (Author), Zhang, S. (Author)
Format: Article
Language: English
Published: NLM (Medline) 2021