Trained Rank Pruning for Efficient Deep Neural Networks

10/09/2019
by   Yuhui Xu, et al.

To accelerate DNN inference, low-rank approximation has been widely adopted because of its solid theoretical rationale and efficient implementations. Several previous works attempted to directly approximate a pre-trained model by low-rank decomposition; however, small approximation errors in the parameters can ripple into a large prediction loss. Clearly, it is not optimal to separate low-rank approximation from training. Unlike previous works, this paper integrates low-rank approximation and regularization into the training process. We propose Trained Rank Pruning (TRP), which alternates between low-rank approximation and training. TRP maintains the capacity of the original network while imposing low-rank constraints during training. A nuclear-norm regularizer optimized by stochastic sub-gradient descent is utilized to further promote low rank in TRP. Networks trained with TRP have an inherently low-rank structure and can be approximated with negligible performance loss, thus eliminating the need for fine-tuning after low-rank approximation. The proposed method is comprehensively evaluated on CIFAR-10 and ImageNet, outperforming previous compression counterparts based on low-rank approximation. Our code is available at: https://github.com/yuhuixu1993/Trained-Rank-Pruning.
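
The alternating scheme described in the abstract can be sketched in a few lines of PyTorch. This is only an illustrative sketch of the idea, not the authors' implementation (see the linked repository for that): the hyper-parameter names `trp_period`, `rank_ratio`, and `nuclear_coef` are assumptions made for the example, and only 2-D weight matrices are handled for brevity.

```python
# Minimal sketch of the TRP idea: ordinary SGD training, plus (1) a nuclear-norm
# sub-gradient added to each weight gradient and (2) a periodic replacement of
# the weights by their truncated-SVD (low-rank) approximation.
import torch

def low_rank_approx(weight: torch.Tensor, rank_ratio: float = 0.5) -> torch.Tensor:
    """Project a 2-D weight matrix onto its top singular components."""
    U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
    k = max(1, int(S.numel() * rank_ratio))
    return U[:, :k] @ torch.diag(S[:k]) @ Vh[:k, :]

def nuclear_subgradient(weight: torch.Tensor) -> torch.Tensor:
    """A sub-gradient of the nuclear norm ||W||_* is U V^T (from W = U S V^T)."""
    U, _, Vh = torch.linalg.svd(weight, full_matrices=False)
    return U @ Vh

def trp_step(model, loss, optimizer, step,
             trp_period=20, nuclear_coef=3e-4, rank_ratio=0.5):
    optimizer.zero_grad()
    loss.backward()
    # Nuclear-norm regularization via its stochastic sub-gradient.
    with torch.no_grad():
        for p in model.parameters():
            if p.dim() == 2 and p.grad is not None:
                p.grad += nuclear_coef * nuclear_subgradient(p)
    optimizer.step()
    # Periodically impose the low-rank constraint on the trained weights.
    if step % trp_period == 0:
        with torch.no_grad():
            for p in model.parameters():
                if p.dim() == 2:
                    p.copy_(low_rank_approx(p, rank_ratio))
```

Because the low-rank projection is applied repeatedly during training rather than once after it, the network converges to weights that are already (close to) low rank, which is why the final decomposition needs no fine-tuning.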


Related research

04/30/2020 · TRP: Trained Rank Pruning for Efficient Deep Neural Networks
To enable DNNs on edge devices like mobile phones, low-rank approximatio...

03/28/2017 · Coordinating Filters for Faster Deep Neural Networks
Very large-scale Deep Neural Networks (DNNs) have achieved remarkable su...

05/04/2023 · Cuttlefish: Low-Rank Model Training without All the Tuning
Recent research has shown that training low-rank neural networks can eff...

05/25/2023 · Sharpness-Aware Minimization Leads to Low-Rank Features
Sharpness-aware minimization (SAM) is a recently proposed method that mi...

04/12/2022 · Compact Model Training by Low-Rank Projection with Energy Transfer
Low-rankness plays an important role in traditional machine learning, bu...

01/29/2019 · On the Effect of Low-Rank Weights on Adversarial Robustness of Neural Networks
Recently, there has been an abundance of works on designing Deep Neural ...

03/19/2020 · Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression
In this paper, we analyze two popular network compression techniques, i....
