Journal Article Details
IEEE Access
Global Biased Pruning Considering Layer Contribution
Li Li [1], Hailin Sun [1], Zheng Huang [1]
[1] School of Electronic and Information Engineering, Beihang University, Beijing, China;
Keywords: deep learning; network pruning; convolutional neural networks
DOI: 10.1109/ACCESS.2020.3025130
Source: DOAJ
【 Abstract 】

Convolutional neural networks (CNNs) have made impressive achievements in many areas, but these successes are limited by storage and computing costs. Filter pruning is a promising solution for accelerating and compressing CNNs. Most existing filter pruning methods consider only the role of the filter itself and ignore the characteristics of its layer. In this paper, we propose a global biased filter pruning method that considers layer contribution and tends to preferentially remove weak filters in weak layers. The impact of each layer on final performance is quantitatively analyzed, and the improvement between adjacent layers is exploited to represent the layer contribution and determine the weak layers. We introduce layer weights and Taylor expansion to jointly evaluate the filters in different layers, and remove the least important filters to compress the CNNs; the pruned networks are then fine-tuned to restore their predictive power. The experimental results show that the proposed approach can prune 92.63%, 99.06%, 57.60% and 58.97% of the parameters of VGG16, MobileNetV1, ResNet32, and ResNet56, respectively, on CIFAR10, and 78.29% and 62.28% of the parameters of VGG16 and ResNet56, respectively, on CIFAR100, outperforming other methods, and removes 92.30% of the parameters of Tiny-Yolov2 with a negligible mAP loss.
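The abstract describes scoring each filter with a first-order Taylor criterion scaled by a per-layer weight, then pruning the globally least important filters. Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the function taylor_filter_importance and the layer weights in layer_w are hypothetical illustrations of how a layer-contribution factor could bias a global filter ranking.

import torch
import torch.nn as nn

def taylor_filter_importance(conv: nn.Conv2d, layer_w: float) -> torch.Tensor:
    """First-order Taylor score per output filter: |sum(grad * weight)| over
    each filter's parameters, scaled by a (hypothetical) layer weight."""
    g = conv.weight.grad  # populated by a backward pass; same shape as weight
    return layer_w * (conv.weight * g).sum(dim=(1, 2, 3)).abs()

# Toy usage on two conv layers; a smaller layer weight lowers that layer's
# filter scores, biasing pruning toward the "weaker" layer.
convs = [nn.Conv2d(3, 8, 3), nn.Conv2d(8, 16, 3)]
layer_w = [0.4, 1.0]  # assumed layer-contribution weights, not from the paper
x = torch.randn(2, 3, 32, 32)
convs[1](convs[0](x)).sum().backward()  # dummy loss to fill .grad
scores = torch.cat([taylor_filter_importance(c, w) for c, w in zip(convs, layer_w)])
print(scores.argsort()[:4])  # indices of the 4 globally weakest filters

In a full pruning loop, the lowest-scoring filters would be removed and the network fine-tuned, as the abstract describes.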

【 License 】

Unknown   
