Layer Leveled Knowledge Distillation for Deep Neural Network Learning

Master's thesis === National Chung Cheng University === Graduate Institute of Computer Science and Information Engineering === 106 === With the popularity of deep learning and the improvement of computing power, neural networks have become deeper and larger. Despite the capability of such complex models, two challenges remain for deep model training: one is the expensive computational cost, and the other is...
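The thesis title refers to knowledge distillation, where a smaller student network is trained to match a larger teacher's softened output distribution. As a rough illustration only (the function names, temperature value, and loss form below follow Hinton-style distillation and are assumptions, not the thesis's specific layer-leveled method), the core soft-target loss can be sketched as:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between teacher and student soft targets,
    # scaled by T^2 so gradients keep a comparable magnitude across temperatures.
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student soft predictions
    return float((T ** 2) * np.sum(p * (np.log(p) - np.log(q))))
```

The loss is zero when the student reproduces the teacher's logits exactly and positive otherwise; a layer-leveled variant would presumably add similar matching terms at intermediate layers, which is not shown here.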


Bibliographic Details
Main Author: LIN, SHIH-CHIEH (林仕杰)
Other Authors: CHIANG, CHEN-KUO
Format: Others
Language: en_US
Published: 2018
Online Access: http://ndltd.ncl.edu.tw/handle/2mpqs6