CrAM: A Compression-Aware Minimizer

07/28/2022
by Alexandra Peşte, et al.

We examine the question of whether SGD-based optimization of deep neural networks (DNNs) can be adapted to produce models that are both highly accurate and easily compressible. We propose a new compression-aware minimizer dubbed CrAM, which modifies the SGD training iteration in a principled way so as to produce models whose local loss behavior is stable under compression operations such as weight pruning or quantization. Experimental results on standard image classification tasks show that CrAM produces dense models that can be more accurate than standard SGD-type baselines, yet are surprisingly stable under weight pruning: for instance, for ResNet50 on ImageNet, CrAM-trained models can have up to 70% of their weights pruned in one shot with only minor accuracy loss.
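To make the idea concrete, below is a minimal sketch of one plausible compression-aware update in NumPy. It assumes magnitude pruning as the compression operator C and a SAM-style ascent perturbation before compression; the function names, step sizes, and the exact update rule are illustrative assumptions, not the paper's precise formulation. The key ingredient is that the gradient is evaluated at the *compressed* (perturbed) point but applied to the dense weights, encouraging a loss landscape that stays flat under compression.

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Compression operator C: zero out the smallest-magnitude
    fraction `sparsity` of the entries of w."""
    k = int(len(w) * sparsity)
    if k == 0:
        return w.copy()
    thresh = np.sort(np.abs(w))[k - 1]
    out = w.copy()
    out[np.abs(w) <= thresh] = 0.0
    return out

def cram_like_step(w, grad_fn, lr=0.1, rho=0.05, sparsity=0.5):
    """One compression-aware update (sketch):
    1. take a small ascent step of radius rho along the normalized
       gradient (SAM-style perturbation),
    2. apply the compression operator C to the perturbed point,
    3. evaluate the gradient at the compressed point,
    4. apply that gradient to the original dense weights."""
    g = grad_fn(w)
    w_pert = w + rho * g / (np.linalg.norm(g) + 1e-12)
    w_comp = magnitude_prune(w_pert, sparsity)
    return w - lr * grad_fn(w_comp)
```

As a toy check, running this step on a quadratic loss L(w) = ||w||^2 (so grad_fn returns 2w) still decreases the loss, even though at every step half the coordinates of the gradient are zeroed by pruning.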


Related research

- Optimal Brain Compression: A Framework for Accurate Post-Training Quantization and Pruning (08/24/2022)
- oViT: An Accurate Second-Order Pruning Framework for Vision Transformers (10/14/2022)
- DropPruning for Model Compression (12/05/2018)
- A One-Shot Reparameterization Method for Reducing the Loss of Tile Pruning on DNNs (07/29/2022)
- Global Sparse Momentum SGD for Pruning Very Deep Neural Networks (09/27/2019)
- Adaptive Sharpness-Aware Pruning for Robust Sparse Networks (06/25/2023)
- Hardware-aware Pruning of DNNs using LFSR-Generated Pseudo-Random Indices (11/09/2019)
