Zero-Keep Filter Pruning for Energy/Power Efficient Deep Neural Networks
Recent deep learning models achieve high accuracy and fast inference, but they require high-performance computing resources because they contain a large number of parameters. However, not all systems have high-performance hardware. Sometimes, a deep learning model needs to be run on e...
Main Authors: Yunhee Woo, Dongyoung Kim, Jaemin Jeong, Young-Woong Ko, Jeong-Gun Lee
Format: Article
Language: English
Published: MDPI AG, 2021-05-01
Series: Electronics
Online Access: https://www.mdpi.com/2079-9292/10/11/1238
Similar Items
- Efficient Convolution Neural Networks for Object Tracking Using Separable Convolution and Filter Pruning, by Yuanhong Mao, et al. (2019-01-01)
- Pruning optimization based on deep convolution neural network, by Ma Zhinan, et al. (2018-12-01)
- Compressing Convolutional Neural Networks by Pruning Density Peak Filters, by Yunseok Jang, et al. (2021-01-01)
- Human Segmentation Based on Compressed Deep Convolutional Neural Network, by Jun Miao, et al. (2020-01-01)
- Filter Pruning Without Damaging Networks Capacity, by Yuding Zuo, et al. (2020-01-01)