Why and when can deep-but not shallow-networks avoid the curse of dimensionality: A review

The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which it can be exponentially better than shallow learning. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage.
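
As a point of reference for the "exponentially better" claim, the separation discussed in this line of work takes roughly the following form. This is a sketch under assumed standard notation that is not defined in this record: a target function of \(n\) variables with smoothness \(m\), approximation accuracy \(\epsilon\), and \(N\) counting trainable parameters.

\[
N_{\text{shallow}}(\epsilon) = O\!\left(\epsilon^{-n/m}\right)
\qquad \text{vs.} \qquad
N_{\text{deep}}(\epsilon) = O\!\left((n-1)\,\epsilon^{-2/m}\right)
\]

Here the shallow bound is the classical curse-of-dimensionality rate, while the deep bound assumes an architecture matched to a binary-tree compositional target whose constituent functions each depend on two variables and have smoothness \(m\).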

Bibliographic Details
Main Authors: Mhaskar, Hrushikesh (Author), Rosasco, Lorenzo (Author), Miranda, Brando (Author), Liao, Qianli (Author), Poggio, Tomaso A (Contributor)
Other Authors: Center for Brains, Minds and Machines at MIT (Contributor), Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences (Contributor), McGovern Institute for Brain Research at MIT (Contributor)
Format: Article
Language: English
Published: Institute of Automation, Chinese Academy of Sciences, 2017.
Online Access: Get fulltext