LOss-Based SensiTivity rEgulaRization: towards deep sparse neural networks

11/16/2020
by Enzo Tartaglione, et al.

LOBSTER (LOss-Based SensiTivity rEgulaRization) is a method for training neural networks with a sparse topology. The sensitivity of a network parameter is defined as the variation of the loss function with respect to a variation of that parameter. Parameters with low sensitivity, i.e. those having little impact on the loss when perturbed, are shrunk and then pruned to sparsify the network. The method trains a network from scratch, i.e. without preliminary learning or rewinding. Experiments on multiple architectures and datasets show competitive compression ratios with minimal computational overhead.
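As a rough illustration of the idea, the sketch below approximates the sensitivity of a parameter by the magnitude of the loss gradient, S(w) ≈ |∂L/∂w|: low-sensitivity parameters receive extra decay and are zeroed once they fall below a threshold. This is a minimal PyTorch-style sketch, not the paper's exact update rule; the function name, the shrink schedule, and the constants are illustrative assumptions.

```python
import torch

def sensitivity_shrink_step(model, loss, lr=0.1, lam=1e-4, prune_thresh=1e-3):
    """One hedged sketch of a loss-based sensitivity update.

    Sensitivity is approximated by the magnitude of the loss gradient
    w.r.t. each parameter; parameters whose sensitivity is low are
    shrunk toward zero, and those falling below `prune_thresh` are
    pruned (zeroed). lr, lam and prune_thresh are illustrative values.
    """
    model.zero_grad()
    loss.backward()
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                continue
            sens = p.grad.abs()                          # sensitivity proxy
            # shrink factor: strongest where sensitivity is lowest
            shrink = torch.clamp(1.0 - sens / (sens.max() + 1e-12), min=0.0)
            p -= lr * p.grad                             # usual gradient step
            p -= lam * shrink * p                        # sensitivity-driven decay
            p[p.abs() < prune_thresh] = 0.0              # prune tiny weights
```

In a full training loop this step would replace the plain optimizer update, and pruned weights would typically be held at zero with a mask so the topology stays sparse.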



