Scaling Up Exact Neural Network Compression by ReLU Stability

02/15/2021
by Thiago Serra, et al.

We can compress a neural network while exactly preserving its underlying functionality with respect to a given input domain if some of its neurons are stable. However, current approaches to determine the stability of neurons in networks with Rectified Linear Unit (ReLU) activations require solving, or finding a good approximation to, multiple discrete optimization problems. In this work, we introduce an algorithm based on solving a single optimization problem to identify all stable neurons. Our approach is, at the median, 21 times faster than the state-of-the-art method, which allows us to explore exact compression on deeper (5 × 100) and wider (2 × 800) networks within minutes. For classifiers trained under an amount of L1 regularization that does not worsen accuracy, we can remove up to 40% […]
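To make the notion of stable neurons and exact compression concrete, here is a minimal sketch. It is not the authors' algorithm: it uses interval bound propagation over a box input domain as a cheap, conservative stability test, whereas the paper identifies all stable neurons from a single optimization problem. The function names (preactivation_bounds, compress_layer), the [0, 1] box domain, and the NumPy setup are illustrative assumptions.

```python
# Minimal sketch, not the paper's method: interval bound propagation is used
# here as a conservative stand-in for the single-optimization-problem test.
import numpy as np


def preactivation_bounds(W, b, lo, hi):
    """Bounds on W @ x + b over the box lo <= x <= hi (elementwise)."""
    W_pos, W_neg = np.clip(W, 0, None), np.clip(W, None, 0)
    lower = W_pos @ lo + W_neg @ hi + b
    upper = W_pos @ hi + W_neg @ lo + b
    return lower, upper


def compress_layer(W1, b1, W2, b2, lo, hi):
    """Exactly compress hidden ReLU layer (W1, b1) feeding layer (W2, b2).

    Stably inactive units (upper bound <= 0) always output 0 and are dropped;
    stably active units (lower bound >= 0) act linearly and are folded into
    the next layer as a direct linear term in the input.
    """
    lower, upper = preactivation_bounds(W1, b1, lo, hi)
    inactive = upper <= 0
    active = lower >= 0
    keep = ~(inactive | active)                  # genuinely nonlinear units stay

    W_fold = W2[:, active] @ W1[active]          # linear bypass from the input
    b_fold = W2[:, active] @ b1[active] + b2
    return W1[keep], b1[keep], W2[:, keep], W_fold, b_fold


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
    W2, b2 = rng.normal(size=(3, 8)), rng.normal(size=3)
    lo, hi = np.zeros(4), np.ones(4)             # e.g. inputs scaled to [0, 1]

    W1k, b1k, W2k, W_fold, b_fold = compress_layer(W1, b1, W2, b2, lo, hi)

    x = rng.uniform(lo, hi)
    original = W2 @ np.maximum(W1 @ x + b1, 0) + b2
    compressed = W2k @ np.maximum(W1k @ x + b1k, 0) + W_fold @ x + b_fold
    assert np.allclose(original, compressed)     # same function on the box
```

Dropping stably inactive units and folding stably active ones into the next layer leaves the input-output map on the box unchanged, which is the sense in which the compression is exact.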

Related Research

04/25/2018 ∙ Towards Fast Computation of Certified Robustness for ReLU Networks
Verifying the robustness property of a general Rectified Linear Unit (Re...

11/15/2020 ∙ Stability Analysis of Complementarity Systems with Neural Network Controllers
Complementarity problems, a class of mathematical optimization problems ...

02/21/2018 ∙ Interpreting Neural Network Judgments via Minimal, Stable, and Symbolic Corrections
The paper describes a new algorithm to generate minimal, stable, and sym...

12/13/2021 ∙ Acceleration techniques for optimization over trained neural network ensembles
We study optimization problems where the objective function is modeled t...

05/28/2020 ∙ Exploiting Non-Linear Redundancy for Neural Model Compression
Deploying deep learning models, comprising of non-linear combination of ...

01/01/2020 ∙ Lossless Compression of Deep Neural Networks
Deep neural networks have been successful in many predictive modeling ta...

05/13/2020 ∙ The effect of Target Normalization and Momentum on Dying ReLU
Optimizing parameters with momentum, normalizing data values, and using ...