
MorphNet: Fast & Simple Resource-Constrained Structure Learning of Deep Networks

11/18/2017
by Ariel Gordon, et al.

We present MorphNet, an approach to automate the design of neural network structures. MorphNet iteratively shrinks and expands a network, shrinking via a resource-weighted sparsifying regularizer on activations and expanding via a uniform multiplicative factor on all layers. In contrast to previous approaches, our method is scalable to large networks, adaptable to specific resource constraints (e.g. the number of floating-point operations per inference), and capable of increasing the network's performance. When applied to standard network architectures on a wide variety of datasets, our approach discovers novel structures in each domain, obtaining higher performance while respecting the resource constraint.
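The shrink-expand loop described above can be sketched in a few lines. The following is a minimal illustrative model, not the paper's implementation: `flops` is a toy cost model for a chain of layers, the per-channel scale factors (`gammas`) are supplied synthetically rather than learned with the FLOP-weighted sparsifying penalty, and the threshold and step size are arbitrary.

```python
# Hedged sketch of one MorphNet shrink-expand iteration.
# All helper names, numbers, and the cost model are illustrative assumptions.

def flops(widths, input_width=3):
    """Toy FLOP-per-inference model for a chain of layers:
    each layer costs in_channels * out_channels."""
    total, prev = 0, input_width
    for w in widths:
        total += prev * w
        prev = w
    return total

def shrink(gammas, threshold=0.1):
    """Shrinking step: after training with a resource-weighted L1 penalty
    on per-channel scale factors, keep only channels whose scale factor
    stayed above a threshold (gammas are synthetic here, not learned)."""
    return [max(1, sum(g > threshold for g in layer)) for layer in gammas]

def expand(widths, budget, step=0.05):
    """Expanding step: apply the largest uniform multiplicative factor
    to all layers that keeps the cost within the resource budget."""
    factor = 1.0
    while flops([max(1, int(w * (factor + step))) for w in widths]) <= budget:
        factor += step
    return [max(1, int(w * factor)) for w in widths]

# Toy 3-layer network, constrained to the seed network's own FLOP cost.
widths = [16, 32, 16]
budget = flops(widths)
gammas = [[0.5] * 10 + [0.01] * 6,   # 6 channels zeroed out by the penalty
          [0.5] * 20 + [0.01] * 12,  # 12 channels zeroed out
          [0.5] * 16]                # all channels survive
shrunk = shrink(gammas)              # -> [10, 20, 16]
grown = expand(shrunk, budget)       # widths redistributed under the budget
```

The point of the toy example is the redistribution: the expand step returns the FLOPs freed by weak layers to all layers uniformly, so the final structure spends the same budget differently than the seed network did.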


Related research

- No Multiplication? No Floating Point? No Problem! Training Networks for Efficient Inference (09/24/2018): For successful deployment of deep neural networks on highly resource-co...
- FAST: DNN Training Under Variable Precision Block Floating Point with Stochastic Rounding (10/28/2021): Block Floating Point (BFP) can efficiently support quantization for Deep...
- Automated Search for Resource-Efficient Branched Multi-Task Networks (08/24/2020): The multi-modal nature of many vision problems calls for neural network ...
- Glueability of resource proof-structures: inverting the Taylor expansion (long version) (10/17/2019): A Multiplicative-Exponential Linear Logic (MELL) proof-structure can be ...
- Gluing resource proof-structures: inhabitation and inverting the Taylor expansion (08/06/2020): A Multiplicative-Exponential Linear Logic (MELL) proof-structure can be ...
- DECORE: Deep Compression with Reinforcement Learning (06/11/2021): Deep learning has become an increasingly popular and powerful option for...
- Local Propagation in Constraint-based Neural Network (02/18/2020): In this paper we study a constraint-based representation of neural netwo...