Smaller Models, Better Generalization

08/29/2019
by Mayank Sharma et al.
Indian Institute of Technology Delhi

Reducing network complexity has been a major research focus in recent years with the advent of mobile technology. Convolutional neural networks that perform various vision tasks without large memory overhead are the need of the hour. This paper presents a qualitative and quantitative analysis of reducing network complexity using an upper bound on the Vapnik-Chervonenkis dimension, pruning, and quantization. We observe a general trend of improving accuracy as the models are quantized. We propose a novel loss function that achieves considerable sparsity at accuracies comparable to those of dense models. We compare our method against regularizations prevalent in the literature and show that it produces sparser models that generalize well.
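The abstract does not specify the form of the proposed loss function or the pruning and quantization schedules, so the sketch below is only an illustration of the generic compression pipeline it refers to: train with a sparsity-promoting regularizer (here a plain L1 penalty as a stand-in for the paper's VC-dimension-derived term), prune small-magnitude weights, then quantize. Model, hyperparameters, and the 80% pruning ratio are all assumptions, not values from the paper.

```python
# Illustrative sketch only: the paper's actual loss term and schedules are not given
# in the abstract. This shows the generic pattern of (1) training with a
# sparsity-promoting regularizer, (2) magnitude pruning, and (3) post-training
# quantization, using standard PyTorch utilities.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(                      # stand-in model; the paper studies CNNs for vision tasks
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Flatten(), nn.Linear(16 * 32 * 32, 10),
)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
lam = 1e-4                                  # regularization strength (hypothetical value)

def regularized_loss(outputs, targets):
    # Task loss plus an L1 penalty on the weights as a generic sparsity surrogate;
    # the paper proposes its own loss derived from a VC-dimension upper bound.
    l1 = sum(p.abs().sum() for p in model.parameters())
    return criterion(outputs, targets) + lam * l1

# Training step (data loader omitted):
# for x, y in loader:
#     optimizer.zero_grad()
#     regularized_loss(model(x), y).backward()
#     optimizer.step()

# Magnitude pruning: zero out the 80% smallest-magnitude weights per conv/linear layer.
for module in model.modules():
    if isinstance(module, (nn.Conv2d, nn.Linear)):
        prune.l1_unstructured(module, name="weight", amount=0.8)
        prune.remove(module, "weight")      # make the pruned zeros permanent

# Post-training dynamic quantization of the linear layers to int8.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
```

The ordering shown (regularized training, then pruning, then quantization) matches the compression steps named in the abstract, but how the paper combines them, and whether fine-tuning follows each step, is not stated there.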
