Size-independent sample complexity of neural networks

We study the sample complexity of learning neural networks by providing new bounds on their Rademacher complexity, assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth and, under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest.
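As a hedged illustration of the kind of bound the abstract describes (a sketch based on the authors' earlier conference version of this work, not text quoted from this record, with the symbols d, M_F(j), B, and m being our notation): for a depth-d network with 1-Lipschitz, positive-homogeneous activations, a Frobenius-norm bound M_F(j) on the weight matrix of layer j, inputs of Euclidean norm at most B, and m i.i.d. samples, the Rademacher complexity satisfies roughly

    % flavor of the norm-based bound, up to constants and log factors
    \mathcal{R}_m(\mathcal{H}) \;\lesssim\; \frac{B \,\sqrt{d}\, \prod_{j=1}^{d} M_F(j)}{\sqrt{m}}

so depth enters only through a \sqrt{d} factor rather than exponentially; under the additional assumptions the abstract mentions, even this factor can be dropped, giving a bound independent of both depth and width.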


Bibliographic Details
Main Authors: Golowich, Noah (Author), Rakhlin, Alexander (Author), Shamir, Ohad (Author)
Format: Article
Language: English
Published: Oxford University Press (OUP), 2021-12-03T15:57:49Z.
Subjects:
Online Access: Get fulltext: https://hdl.handle.net/1721.1/138309
LEADER 01006 am a22001813u 4500
001 138309
042 |a dc 
100 1 0 |a Golowich, Noah  |e author 
700 1 0 |a Rakhlin, Alexander  |e author 
700 1 0 |a Shamir, Ohad  |e author 
245 0 0 |a Size-independent sample complexity of neural networks 
260 |b Oxford University Press (OUP),   |c 2021-12-03T15:57:49Z. 
856 |z Get fulltext  |u https://hdl.handle.net/1721.1/138309 
520 |a We study the sample complexity of learning neural networks by providing new bounds on their Rademacher complexity, assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth and, under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest. 
546 |a en 
655 7 |a Article 
773 |t 10.1093/IMAIAI/IAZ007 
773 |t Information and Inference