Computer and Modernization, 2021, Vol. 0, Issue (12): 13-18.

Channel Pruning of Convolutional Neural Network Based on Transfer Learning

  

1. (North China Institute of Computing Technology, Beijing 100083, China)
Online: 2021-12-24    Published: 2021-12-24

Abstract: Convolutional neural networks are widely used in many fields such as computer vision. However, their large parameter counts and high computational cost exceed the storage and computing resources that many edge devices can provide. To address these problems, a transfer learning method is introduced to raise the sparsity ratio achievable by the channel pruning method based on the scaling factors of BN layers. The effects of different degrees of transfer on the sparsity ratio and on channel pruning are compared, and experiments are designed from a neural architecture search (NAS) perspective to explore the accuracy limit of pruning and the convergence of iteratively pruned structures. The results show that, compared with the original model, with an accuracy loss below 0.10, the number of parameters is reduced by 89.1% and the model storage size by 89.3%. Compared with the original pruning method, the pruning threshold is raised from 0.85 to 0.97, further reducing the parameters by 42.6%. The experiments demonstrate that introducing transfer learning makes it easier to fully sparsify the weights, increases the tolerance of the channel pruning threshold, and yields a higher compression rate. In the pruned-network architecture search process, transfer provides a more efficient starting point for the search, which tends to converge to a locally optimal NAS solution.
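To make the underlying mechanism concrete, the following is a minimal PyTorch sketch of BN-scaling-factor channel pruning in the style the abstract describes. It is not the authors' code: the function names, the hyperparameter sparsity_lambda, and the reading of the pruning threshold as a global prune ratio over channels are all assumptions. An L1 penalty drives BN gamma values toward zero during training, and channels whose gamma falls below a global percentile cutoff are marked for removal.

import torch
import torch.nn as nn

def add_bn_sparsity_grad(model, sparsity_lambda=1e-4):
    # Call after loss.backward(): add the L1 subgradient on every BN
    # scaling factor (gamma) so unimportant channels are pushed toward zero.
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.weight.grad.add_(sparsity_lambda * torch.sign(m.weight.data))

def channels_to_prune(model, prune_ratio=0.85):
    # Gather the absolute gamma values of all BN layers, find the global
    # cutoff at the given ratio, and build a per-layer keep mask.
    gammas = torch.cat([m.weight.data.abs().flatten()
                        for m in model.modules()
                        if isinstance(m, nn.BatchNorm2d)])
    cutoff = gammas.sort()[0][int(gammas.numel() * prune_ratio)]
    return {name: m.weight.data.abs() > cutoff   # True = keep channel
            for name, m in model.named_modules()
            if isinstance(m, nn.BatchNorm2d)}

Under this reading, raising prune_ratio from 0.85 to 0.97, as reported above, removes 97% of channels globally; a transfer-learned initialization would then matter because it keeps enough gammas near zero for so aggressive a cut to preserve accuracy.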

Key words: convolutional neural network, transfer learning, channel pruning, neural architecture search