Post-training deep neural network pruning via layer-wise calibration

04/30/2021 ∙ by Ivan Lazarevich, et al.

We present a post-training weight pruning method for deep neural networks that achieves accuracy levels tolerable for the production setting and that is sufficiently fast to be run on commodity hardware such as desktop CPUs or edge devices. We propose a data-free extension of the approach for computer vision models based on automatically-generated synthetic fractal images. We obtain state-of-the-art results for data-free neural network pruning, with a ~1.5% accuracy drop for a ResNet50 on ImageNet at 50% sparsity. When using real data, we are able to get a ResNet50 model on ImageNet with 65% sparsity in 8-bit precision in a post-training setting with a ~1% accuracy drop. We release the code as a part of the OpenVINO(TM) Post-Training Optimization tool.
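To make the idea of layer-wise calibration concrete, the sketch below is a minimal PyTorch illustration, not the authors' released OpenVINO implementation: it magnitude-prunes a single linear layer, then tunes the surviving weights so the pruned layer reproduces the dense layer's outputs on a small calibration batch. All names (`magnitude_prune_mask`, `calibrate_layer`) and hyperparameters are hypothetical.

```python
# Hedged sketch of post-training pruning with layer-wise calibration.
# Not the OpenVINO(TM) Post-Training Optimization tool's code; names and
# hyperparameters here are illustrative assumptions.

import torch
import torch.nn as nn

def magnitude_prune_mask(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Boolean mask that keeps the largest-magnitude weights."""
    k = int(weight.numel() * sparsity)
    if k == 0:
        return torch.ones_like(weight, dtype=torch.bool)
    threshold = weight.abs().flatten().kthvalue(k).values
    return weight.abs() > threshold

def calibrate_layer(layer: nn.Linear, mask: torch.Tensor,
                    calib_inputs: torch.Tensor, steps: int = 100,
                    lr: float = 1e-3) -> None:
    """Tune the surviving weights of one layer so its output matches the
    dense layer's output on calibration data (layer-wise calibration)."""
    with torch.no_grad():
        target = layer(calib_inputs)      # dense reference output
        layer.weight.mul_(mask)           # apply the pruning mask
    params = [layer.weight] + ([layer.bias] if layer.bias is not None else [])
    opt = torch.optim.Adam(params, lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(layer(calib_inputs), target)
        loss.backward()
        opt.step()
        with torch.no_grad():
            layer.weight.mul_(mask)       # keep pruned weights at zero

# Toy usage: random inputs stand in for the calibration batch, which in
# the data-free variant would be synthetic fractal images.
layer = nn.Linear(256, 128)
x = torch.randn(64, 256)
mask = magnitude_prune_mask(layer.weight.data, sparsity=0.5)
calibrate_layer(layer, mask, x)
print(f"sparsity: {(layer.weight == 0).float().mean().item():.2%}")
```

Calibrating each layer against its own dense outputs avoids end-to-end fine-tuning on the full training set, which is what makes a post-training approach like this fast enough for desktop CPUs and edge devices.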
