Systematic Weight Pruning of DNNs using Alternating Direction Method of Multipliers

02/15/2018
by Tianyun Zhang, et al.

We present a systematic weight pruning framework for deep neural networks (DNNs) based on the alternating direction method of multipliers (ADMM). We first formulate weight pruning as a constrained nonconvex optimization problem, and then adopt the ADMM framework to solve it systematically. We show that ADMM is well suited to weight pruning because of the computational efficiency it offers. Compared with prior work, we achieve a much higher compression ratio at the same test accuracy, together with a faster convergence rate.
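To make the formulation concrete: the weight pruning problem can be written as minimizing the network loss f({W_i}) subject to a per-layer cardinality constraint card(W_i) <= l_i. In the standard ADMM treatment, each layer gets an auxiliary variable Z_i and the constraint set's indicator function, and every outer iteration then alternates a W-update (ordinary training of the nonconvex loss plus a smooth quadratic penalty), a Z-update with a closed-form solution (a Euclidean projection keeping the l_i largest-magnitude weights), and a dual update. The Python below is a minimal single-tensor sketch of this scheme, not the authors' implementation; the function names, the gradient callback loss_grad, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

def project_topk(w, k):
    """Euclidean projection onto {W : card(W) <= k}: keep the k
    largest-magnitude entries of w and zero out the rest."""
    z = np.zeros_like(w)
    if k > 0:
        idx = np.argpartition(np.abs(w).ravel(), -k)[-k:]
        z.ravel()[idx] = w.ravel()[idx]
    return z

def admm_prune(w, loss_grad, k, rho=1e-3, lr=1e-2,
               admm_iters=30, grad_steps=100):
    """Illustrative ADMM weight-pruning loop for one weight tensor.

    loss_grad(w) must return the gradient of the network loss f at w
    (a hypothetical callback; in practice, a training framework).
    """
    z = project_topk(w, k)   # auxiliary variable Z
    u = np.zeros_like(w)     # scaled dual variable U
    for _ in range(admm_iters):
        # W-update: minimize f(W) + (rho/2)||W - Z + U||^2 by gradient
        # descent -- the penalty is smooth, so standard (stochastic)
        # training applies unchanged.
        for _ in range(grad_steps):
            w = w - lr * (loss_grad(w) + rho * (w - z + u))
        # Z-update: closed form -- project W + U onto the
        # cardinality-constrained set.
        z = project_topk(w + u, k)
        # Dual update on the scaled multiplier.
        u = u + w - z
    # Final hard prune: retain the support ADMM converged to.
    return project_topk(w, k)
```

The Z-update is what makes ADMM attractive here: although the cardinality set is nonconvex, projecting onto it is a cheap magnitude selection, so each outer iteration costs little more than a few epochs of ordinary training.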


