WSNet: Compact and Efficient Networks with Weight Sampling

11/28/2017
by Xiaojie Jin, et al.

We present a new approach and a novel architecture, termed WSNet, for learning compact and efficient deep neural networks. Existing approaches conventionally learn full model parameters independently first and then compress them via ad hoc post-processing such as model pruning or filter factorization. In contrast, WSNet learns model parameters by sampling from a compact set of learnable parameters, which naturally enforces parameter sharing throughout the learning process. We show that this weight sampling approach (and the induced WSNet) favorably promotes both weight and computation sharing: compared to baseline networks with an equal number of convolution filters, it efficiently learns much smaller networks with competitive performance. Specifically, we consider learning compact and efficient 1D convolutional neural networks for audio classification. Extensive experiments on multiple audio classification datasets verify the effectiveness of WSNet. Combined with weight quantization, the resulting models are up to 180x smaller and theoretically up to 16x faster than well-established baselines, without noticeable performance drop.
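Concretely, instead of allocating out_channels x in_channels x kernel_size independent weights per layer, each 1D convolution filter can be read out as an overlapping window of one shared compact parameter vector, so sharing holds by construction throughout training. The sketch below illustrates this weight-sampling idea, assuming PyTorch; the class name WSConv1d, the sample_stride parameter, and the sizing of the condensed vector are our own assumptions for illustration, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WSConv1d(nn.Module):
    """Illustrative weight-sampling 1D convolution (not the official WSNet code).

    All filters are overlapping windows of one compact learnable vector, so
    parameters are shared across filters by construction rather than learned
    independently and compressed afterwards.
    """

    def __init__(self, in_channels, out_channels, kernel_size, sample_stride=1):
        super().__init__()
        self.in_channels = in_channels
        self.out_channels = out_channels
        self.kernel_size = kernel_size
        self.sample_stride = sample_stride
        # Compact store: just long enough to cover the last sampled window.
        filter_len = in_channels * kernel_size
        store_len = (out_channels - 1) * sample_stride + filter_len
        self.condensed = nn.Parameter(torch.randn(store_len) * 0.01)

    def forward(self, x):
        filter_len = self.in_channels * self.kernel_size
        # unfold() reads out `out_channels` overlapping windows of the
        # condensed vector; each window becomes one convolution filter.
        weight = self.condensed.unfold(0, filter_len, self.sample_stride)
        weight = weight.reshape(self.out_channels, self.in_channels,
                                self.kernel_size)
        return F.conv1d(x, weight)


# With sample_stride < filter_len the layer holds far fewer parameters than a
# plain Conv1d of the same shape (here: 396 weights instead of 9216).
layer = WSConv1d(in_channels=16, out_channels=64, kernel_size=9, sample_stride=4)
out = layer(torch.randn(2, 16, 1000))  # -> shape (2, 64, 992)
```

The sample stride controls the compression ratio: the smaller it is relative to the filter length, the more heavily adjacent filters overlap in the condensed vector and the fewer parameters the layer stores.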

Related research

01/05/2018 · Learning 3D-FilterMap for Deep Convolutional Neural Networks
We present a novel and compact architecture for deep Convolutional Neura...

02/08/2019 · FSNet: Compression of Deep Convolutional Neural Networks by Filter Summary
We present a novel method of compression of deep Convolutional Neural Ne...

08/25/2023 · Learning Compact Neural Networks with Deep Overparameterised Multitask Learning
Compact neural network offers many benefits for real-world applications....

08/28/2021 · Compact representations of convolutional neural networks via weight pruning and quantization
The state-of-the-art performance for several real-world problems is curr...

06/09/2020 · Learning Shared Filter Bases for Efficient ConvNets
Modern convolutional neural networks (ConvNets) achieve state-of-the-art...

05/03/2022 · Compact Neural Networks via Stacking Designed Basic Units
Unstructured pruning has the limitation of dealing with the sparse and i...

09/01/2022 · Recurrent Convolutional Neural Networks Learn Succinct Learning Algorithms
Neural Networks (NNs) struggle to efficiently learn certain problems, su...
