Deep Net Tree Structure for Balance of Capacity and Approximation Ability
Deep learning has been successfully used in various applications, including image classification, natural language processing, and game theory. At the heart of deep learning is the use of deep neural networks (deep nets for short) with certain structures to build up the estimator. Depth and structure of deep nets are two crucial factors in promoting the development of deep learning. In this paper, we propose a novel tree structure to equip deep nets that compensates for the capacity drawback of deep fully connected neural networks (DFCN) and enhances the approximation ability of deep convolutional neural networks (DCNN). Based on an empirical risk minimization algorithm, we derive fast learning rates for deep nets.
Main Authors: | Charles K. Chui, Shao-Bo Lin, Ding-Xuan Zhou |
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2019-09-01 |
Series: | Frontiers in Applied Mathematics and Statistics |
Subjects: | deep nets; learning theory; deep learning; tree structure; empirical risk minimization |
Online Access: | https://www.frontiersin.org/article/10.3389/fams.2019.00046/full |
id | doaj-ccbe31daf2014df1a27a653ad84a9a1e
record_format | Article
doi | 10.3389/fams.2019.00046
volume | 5
authors and affiliations |
Charles K. Chui: Department of Mathematics, Hong Kong Baptist University, Kowloon, Hong Kong; Department of Statistics, Stanford University, Stanford, CA, United States
Shao-Bo Lin: Department of Mathematics, Wenzhou University, Wenzhou, China; Department of Mathematics, School of Data Science, City University of Hong Kong, Kowloon, Hong Kong
Ding-Xuan Zhou: Department of Mathematics, School of Data Science, City University of Hong Kong, Kowloon, Hong Kong
collection | DOAJ
language | English
format | Article
sources | DOAJ
author | Charles K. Chui, Shao-Bo Lin, Ding-Xuan Zhou
author_sort | Charles K. Chui
title | Deep Net Tree Structure for Balance of Capacity and Approximation Ability
publisher | Frontiers Media S.A.
series | Frontiers in Applied Mathematics and Statistics
issn | 2297-4687
publishDate | 2019-09-01
description | Deep learning has been successfully used in various applications, including image classification, natural language processing, and game theory. At the heart of deep learning is the use of deep neural networks (deep nets for short) with certain structures to build up the estimator. Depth and structure of deep nets are two crucial factors in promoting the development of deep learning. In this paper, we propose a novel tree structure to equip deep nets that compensates for the capacity drawback of deep fully connected neural networks (DFCN) and enhances the approximation ability of deep convolutional neural networks (DCNN). Based on an empirical risk minimization algorithm, we derive fast learning rates for deep nets.
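The abstract's learning rates rest on empirical risk minimization (ERM): from a fixed hypothesis class, pick the function with the smallest average loss on the training sample. A minimal sketch of this general principle follows; it is not the paper's algorithm, and the step-function class and threshold values are hypothetical toys chosen only to illustrate ERM, not the tree-structured deep nets the paper studies.

```python
# Sketch of empirical risk minimization (ERM) over a finite hypothesis class.
# The hypothesis class here (step functions with a few thresholds) is a
# hypothetical toy, not the paper's tree-structured deep nets.

def empirical_risk(h, sample):
    """Average squared loss of hypothesis h on the sample of (x, y) pairs."""
    return sum((h(x) - y) ** 2 for x, y in sample) / len(sample)

def erm(hypotheses, sample):
    """ERM: return the hypothesis with the smallest empirical risk."""
    return min(hypotheses, key=lambda h: empirical_risk(h, sample))

# Toy data generated by a step at 0.5: y = 1 if x >= 0.5, else 0.
sample = [(0.1, 0), (0.3, 0), (0.6, 1), (0.9, 1)]

# Hypothetical class: step functions with thresholds t in {0.2, 0.5, 0.8}.
hypotheses = [lambda x, t=t: 1 if x >= t else 0 for t in (0.2, 0.5, 0.8)]

best = erm(hypotheses, sample)
print(empirical_risk(best, sample))  # the t = 0.5 step fits this sample exactly
```

The paper's contribution concerns how the structure of the hypothesis class (tree-structured deep nets) balances capacity against approximation ability, which governs how fast the risk of the ERM estimator decays with sample size.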
topic | deep nets; learning theory; deep learning; tree structure; empirical risk minimization
url | https://www.frontiersin.org/article/10.3389/fams.2019.00046/full