
MorphNet: Fast & Simple Resource-Constrained Structure Learning of Deep Networks

by Ariel Gordon, et al.

We present MorphNet, an approach to automate the design of neural network structures. MorphNet iteratively shrinks and expands a network, shrinking via a resource-weighted sparsifying regularizer on activations and expanding via a uniform multiplicative factor on all layers. In contrast to previous approaches, our method is scalable to large networks, adaptable to specific resource constraints (e.g. the number of floating-point operations per inference), and capable of increasing the network's performance. When applied to standard network architectures on a wide variety of datasets, our approach discovers novel structures in each domain, obtaining higher performance while respecting the resource constraint.
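The shrink-and-expand loop described in the abstract can be sketched in a few lines: train with a resource-weighted sparsifying regularizer on per-channel scale factors, drop channels whose scales collapse, then apply the largest uniform width multiplier that still fits the resource budget. The sketch below is a minimal, hypothetical illustration of that loop, not the authors' implementation; the FLOP model, threshold, and step size are assumed for demonstration.

```python
import numpy as np

def flop_cost(widths, kernel=9, spatial=1):
    # Rough FLOP model for a conv chain: layer l costs roughly
    # (input channels) * (output channels) * kernel * spatial.
    # The 3 input channels and unit spatial size are assumptions.
    w = [3] + list(widths)
    return sum(w[i] * w[i + 1] * kernel * spatial for i in range(len(widths)))

def shrink(gammas, threshold=1e-2):
    # "Shrink" step: after training with a FLOP-weighted sparsifying
    # regularizer on per-channel scale factors (gammas), keep only the
    # channels whose scale stayed above the threshold.
    return [int(np.sum(np.abs(g) > threshold)) for g in gammas]

def expand(widths, flop_budget):
    # "Expand" step: grow all layer widths by the largest uniform
    # multiplicative factor omega that still respects the FLOP budget.
    omega = 1.0
    while flop_cost([int(round(w * omega * 1.05)) for w in widths]) <= flop_budget:
        omega *= 1.05
    return [int(round(w * omega)) for w in widths]

# One shrink/expand iteration on toy scale factors and widths:
gammas = [np.array([0.5, 0.001, 0.2]), np.array([1.0, 0.0])]
shrunk = shrink(gammas)            # channels surviving the regularizer
grown = expand([8, 8], flop_budget=flop_cost([16, 16]))
```

In the full method this loop is iterated: each shrink pass reallocates capacity away from expensive, low-importance channels, and each expand pass spends the freed budget uniformly across all layers.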



