Journal Article Details
Frontiers in Applied Mathematics and Statistics
Deep Net Tree Structure for Balance of Capacity and Approximation Ability
Shao-Bo Lin3  Ding-Xuan Zhou2  Charles K. Chui1,4
[1] Department of Mathematics, Hong Kong Baptist University, Kowloon, Hong Kong; [2] Department of Mathematics, School of Data Science, City University of Hong Kong, Kowloon, Hong Kong; [3] Department of Mathematics, Wenzhou University, Wenzhou, China; [4] Department of Statistics, Stanford University, Stanford, CA, United States
Keywords: deep nets; learning theory; deep learning; tree structure; empirical risk minimization
DOI: 10.3389/fams.2019.00046
Source: DOAJ
【 Abstract 】

Deep learning has been successfully applied in various domains, including image classification, natural language processing, and game playing. At the heart of deep learning is the use of deep neural networks (deep nets for short) with particular structures to build the estimator. Depth and structure of deep nets are two crucial factors driving the development of deep learning. In this paper, we propose a novel tree structure to equip deep nets, which compensates for the limited capacity of deep fully connected neural networks (DFCN) and enhances the approximation ability of deep convolutional neural networks (DCNN). Based on an empirical risk minimization algorithm, we derive fast learning rates for these tree-structured deep nets.
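To make the notion of a tree-structured deep net concrete, the sketch below builds a toy binary-tree network: leaf nodes each see a disjoint slice of the input, and every internal node merges its two children's outputs through a small ReLU layer. This is only an illustrative sketch of the general idea, not the authors' construction from the paper; the names `tree_net`, `dense`, and `random_layer`, and the choice of random untrained weights, are assumptions made for the example.

```python
import random

def relu(v):
    """Componentwise ReLU activation."""
    return [max(x, 0.0) for x in v]

def dense(v, weights, bias):
    """One fully connected ReLU layer; weights has out_dim rows of in_dim entries."""
    return relu([sum(w * x for w, x in zip(row, v)) + b
                 for row, b in zip(weights, bias)])

def random_layer(in_dim, out_dim, rnd):
    """Random (untrained) weights for illustration only."""
    w = [[rnd.uniform(-0.5, 0.5) for _ in range(in_dim)] for _ in range(out_dim)]
    b = [0.0] * out_dim
    return w, b

def tree_net(x, depth, width, rnd):
    """Evaluate a binary-tree-structured net on input list x.

    Each of the 2**depth leaves processes a disjoint slice of x;
    each internal node combines its two children with a ReLU layer,
    so connections are sparse compared with a fully connected net.
    """
    n_leaves = 2 ** depth
    chunk = len(x) // n_leaves
    slices = [x[i * chunk:(i + 1) * chunk] for i in range(n_leaves)]
    layer = [dense(s, *random_layer(len(s), width, rnd)) for s in slices]
    while len(layer) > 1:
        layer = [dense(a + b, *random_layer(2 * width, width, rnd))
                 for a, b in zip(layer[::2], layer[1::2])]
    return layer[0]

rnd = random.Random(0)
out = tree_net([i / 8.0 for i in range(16)], depth=2, width=4, rnd=rnd)
print(len(out))  # 4
```

Because each node touches only its two children rather than the whole previous layer, the parameter count grows with the number of tree nodes instead of quadratically in the layer width, which hints at how a tree structure can control capacity while retaining a hierarchical, compositional representation.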

【 License 】

Unknown   
