Efficient deep convolutional model compression with an active stepwise pruning approach
by Shengsheng Wang; Chunshang Xing; Dong Liu
International Journal of Computational Science and Engineering (IJCSE), Vol. 22, No. 4, 2020

Abstract: Deep models are structurally large and complex, which makes them hard to deploy on embedded hardware with limited memory and computing power. Although existing compression methods prune deep models effectively, they suffer from several issues, such as the many iterations required in the fine-tuning phase, difficulty in controlling pruning granularity, and the large number of hyperparameters that must be set. In this paper, we propose an active stepwise pruning method based on a logarithmic function that requires only three hyperparameters and a few epochs. We also propose a recovery strategy that repairs incorrect pruning and thereby preserves the prediction accuracy of the model. Pruning and repairing alternate in a cyclic process while the layer weights are updated. Our method prunes the parameters of MobileNet, AlexNet, VGG-16 and ZFNet by factors of 5.6×, 11.7×, 16.6× and 15× respectively without any accuracy loss, which surpasses existing methods.
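The abstract describes stepwise magnitude pruning along a logarithmic sparsity schedule, alternated with a recovery step that repairs incorrect pruning. Below is a minimal illustrative sketch in NumPy, not the authors' implementation: the exact logarithmic function, the recovery criterion, and the names log_sparsity_schedule and prune_and_recover are assumptions made for illustration.

    import numpy as np

    def log_sparsity_schedule(step, total_steps, target_sparsity):
        # Sparsity rises quickly in early cycles and flattens out later,
        # following a logarithmic curve (assumed form; the abstract does
        # not give the paper's exact function).
        return target_sparsity * np.log1p(step) / np.log1p(total_steps)

    def prune_and_recover(weights, mask, sparsity):
        # One prune/repair cycle on a single layer's dense weight tensor.
        # The dense weights are kept so that connections pruned in an
        # earlier cycle can regrow during retraining and be recovered.
        magnitudes = np.abs(weights)
        k = int(sparsity * magnitudes.size)
        if k == 0:
            return mask
        threshold = np.partition(magnitudes.ravel(), k - 1)[k - 1]

        mask = mask.copy()
        # Prune: deactivate active connections whose magnitude falls at or
        # below the cutoff implied by the scheduled sparsity.
        mask[(mask > 0) & (magnitudes <= threshold)] = 0
        # Recover: re-activate previously pruned connections whose magnitude
        # has grown back above the cutoff, repairing earlier pruning
        # decisions that turned out to be wrong (assumed criterion).
        mask[(mask == 0) & (magnitudes > threshold)] = 1
        return mask

    # Example: drive one layer toward 80% sparsity over 5 prune/repair cycles.
    rng = np.random.default_rng(0)
    w = rng.normal(size=(256, 128))
    mask = np.ones_like(w)
    for step in range(1, 6):
        s = log_sparsity_schedule(step, total_steps=5, target_sparsity=0.8)
        mask = prune_and_recover(w, mask, s)
        # ... retrain for a few epochs with `w * mask` in the forward pass,
        # updating the dense weights w between cycles ...

In this reading, only the target sparsity, the number of cycles, and the recovery criterion need to be chosen, which is consistent with the abstract's claim of three hyperparameters and a few epochs.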

Online publication date: Tue, 08-Sep-2020
