Superposition of many models into one

02/14/2019
by Brian Cheung, et al.

We present a method for storing multiple models within a single set of parameters. Models can coexist in superposition and still be retrieved individually. In experiments with neural networks, we show that a surprisingly large number of models can be effectively stored within a single parameter instance. Furthermore, each of these models can undergo thousands of training steps without significantly interfering with other models within the superposition. This approach may be viewed as the online complement of compression: rather than reducing the size of a network after training, we make use of the unrealized capacity of a network during training.
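The abstract only sketches the mechanism, so below is a minimal runnable illustration of storing several models in one parameter vector via random ±1 context vectors, loosely in the spirit of the paper's parameter superposition. The linear-regression setup, the dimensions, and the training loop are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): a parameter vector much larger than
# any single task needs, inputs shared by all tasks, and one random
# sign context per task.
d, n, num_tasks = 1024, 20, 8
X = rng.standard_normal((n, d)) / np.sqrt(d)             # shared inputs
Ys = [rng.standard_normal(n) for _ in range(num_tasks)]  # per-task targets
contexts = [rng.choice([-1.0, 1.0], size=d) for _ in range(num_tasks)]

def train(w, X_eff, y, steps=2000, lr=0.5):
    """Plain gradient descent on squared error. Updates stay in the row
    space of X_eff; the contexts rotate those row spaces apart so tasks
    occupy nearly orthogonal slices of the shared parameter vector."""
    for _ in range(steps):
        w = w - lr * X_eff.T @ (X_eff @ w - y) / n
    return w

# Train the tasks sequentially into ONE parameter vector, each task seen
# through its own context (an elementwise sign flip of the inputs).
w = np.zeros(d)
for k in range(num_tasks):
    w = train(w, X * contexts[k], Ys[k])

# Retrieval: applying a task's context again reads that model back out.
# Earlier tasks survive thousands of later training steps almost intact.
for k in range(num_tasks):
    mse = np.mean(((X * contexts[k]) @ w - Ys[k]) ** 2)
    print(f"task {k}: MSE after all {num_tasks} tasks = {mse:.4f}")

# Baseline without contexts: every task writes into the same row space,
# so each new task overwrites the previous ones.
w0 = np.zeros(d)
for k in range(num_tasks):
    w0 = train(w0, X, Ys[k])
print("no contexts, task 0 MSE:", np.mean((X @ w0 - Ys[0]) ** 2))
```

With contexts, each task's gradient updates land in a nearly orthogonal subspace of the over-parameterized weight vector, so all the fits coexist; the no-context baseline shows the usual catastrophic overwriting, with only the last task retained.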

Related research

02/10/2020 - Taylorized Training: Towards Better Approximation of Neural Network Training at Finite Width
We propose Taylorized training as an initiative towards better understan...

11/08/2018 - Measuring the Effects of Data Parallelism on Neural Network Training
Recent hardware developments have made unprecedented amounts of data par...

02/07/2020 - Understanding and Optimizing Packed Neural Network Training for Hyper-Parameter Tuning
As neural networks are increasingly employed in machine learning practic...

05/30/2022 - Superposing Many Tickets into One: A Performance Booster for Sparse Neural Network Training
Recent works on sparse neural network training (sparse training) have sh...

08/07/2017 - Parallelizing Over Artificial Neural Network Training Runs with Multigrid
Artificial neural networks are a popular and effective machine learning ...

07/11/2022 - Towards Crowdsourced Training of Large Neural Networks using Decentralized Mixture-of-Experts
Many recent breakthroughs in deep learning were achieved by training inc...

10/27/2016 - Professor Forcing: A New Algorithm for Training Recurrent Networks
The Teacher Forcing algorithm trains recurrent networks by supplying obs...
