Learning Nested Sparse Structures in Deep Neural Networks

12/11/2017
by Eunwoo Kim, et al.

Recently, demand has been increasing for compact deep architectures that remove unnecessary redundancy and improve inference speed. While many recent works focus on reducing redundancy by eliminating unneeded weight parameters, a single deep architecture still cannot serve multiple devices with different resource budgets. When a new device or operating condition requires a new deep architecture, a new network must be constructed and trained from scratch. In this work, we propose a novel deep learning framework, called a nested sparse network, which exploits an n-in-1-type nested structure in a neural network. A nested sparse network consists of multiple levels of networks, each associated with a different sparsity ratio, and higher-level networks share parameters with lower-level networks to enable stable nested learning. The proposed framework realizes a resource-aware, versatile architecture, as the same network can meet diverse resource requirements. Moreover, the proposed nested network can learn different forms of knowledge in its internal networks at different levels, enabling multiple tasks with a single network, such as coarse-to-fine hierarchical classification. To train the proposed nested sparse network, we propose efficient weight connection learning and channel and layer scheduling strategies. We evaluate our network on multiple tasks, including adaptive deep compression, knowledge distillation, and learning a class hierarchy, and demonstrate that nested sparse networks perform competitively with, but more efficiently than, existing methods.
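To make the nested structure concrete, below is a minimal PyTorch sketch of a linear layer with nested sparsity masks. It is an illustration only: NestedSparseLinear, its sparsity levels, and the magnitude-based mask construction are assumptions standing in for the paper's learned weight connectivity and scheduling, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NestedSparseLinear(nn.Module):
    """Hypothetical sketch: a linear layer whose weights carry nested
    binary masks, so the level-k sub-network reuses every parameter of
    the sparser sub-networks below it (the n-in-1 idea)."""

    def __init__(self, in_features, out_features, sparsity=(0.9, 0.5, 0.0)):
        super().__init__()
        self.weight = nn.Parameter(0.01 * torch.randn(out_features, in_features))
        # Rank weights once by magnitude (a stand-in for the paper's
        # learned connectivity). For each sparsity level s, keep the top
        # (1 - s) fraction, so every mask is a superset of the sparser ones.
        scores = self.weight.detach().abs().flatten()
        order = scores.argsort(descending=True)
        masks = []
        for s in sparsity:
            keep = int(round((1.0 - s) * scores.numel()))
            m = torch.zeros_like(scores)
            m[order[:keep]] = 1.0
            masks.append(m.view_as(self.weight))
        self.register_buffer("masks", torch.stack(masks))  # (levels, out, in)

    def forward(self, x, level):
        # Higher level -> denser mask -> larger effective sub-network.
        return F.linear(x, self.weight * self.masks[level])

layer = NestedSparseLinear(64, 32)
x = torch.randn(8, 64)
y_core = layer(x, level=0)  # sparsest core sub-network (90% of weights off)
y_full = layer(x, level=2)  # full network; shares all core parameters
```

Because the masks are nested, updating the level-0 core also updates weights that every higher level uses, which is what lets a single parameter set serve several resource budgets.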


Related research:

03/07/2022 · Dynamic ConvNets on Tiny Devices via Nested Sparsity
This work introduces a new training and compression pipeline to build Ne...

04/09/2019 · Deep Virtual Networks for Memory Efficient Inference of Multiple Tasks
Deep networks consume a large amount of memory by their nature. A natura...

07/13/2020 · Nested Learning For Multi-Granular Tasks
Standard deep neural networks (DNNs) are commonly trained in an end-to-e...

01/27/2021 · Bayesian Nested Neural Networks for Uncertainty Calibration and Adaptive Compression
Nested networks or slimmable networks are neural networks whose architec...

06/20/2018 · Doubly Nested Network for Resource-Efficient Inference
We propose doubly nested network (DNNet) where all neurons represent thei...

06/18/2020 · Shapeshifter Networks: Cross-layer Parameter Sharing for Scalable and Effective Deep Learning
We present Shapeshifter Networks (SSNs), a flexible neural network frame...

01/02/2019 · Multitask Learning Deep Neural Network to Combine Revealed and Stated Preference Data
It is an enduring question how to combine revealed preference (RP) and s...
