Triangular Dropout: Variable Network Width without Retraining

05/02/2022
by Edward W. Staley, et al.

One of the most fundamental design choices in neural networks is layer width: it sets how much a network can learn and determines the complexity of the solution. This latter property is often exploited when introducing information bottlenecks, forcing a network to learn compressed representations. However, such an architectural decision is typically immutable once training begins; switching to a more compressed architecture requires retraining. In this paper we present a new layer design, called Triangular Dropout, which does not have this limitation. After training, the layer can be arbitrarily reduced in width to trade performance for narrowness. We demonstrate the construction and potential use cases of such a mechanism in three areas. First, we describe the formulation of Triangular Dropout in autoencoders, creating models with selectable compression after training. Second, we add Triangular Dropout to VGG19 on ImageNet, creating a powerful network which, without retraining, can be significantly reduced in parameters. Finally, we explore the application of Triangular Dropout to reinforcement learning (RL) policies on selected control problems.
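The abstract does not spell out the layer's exact formulation. As a rough sketch, one way to obtain a width-selectable layer is a nested ("triangular") mask: during training a width k is sampled per example and all units past k are zeroed, so every prefix of the layer learns to function on its own and the layer can later be truncated to any width. The PyTorch module below is an illustrative assumption of that idea, not the paper's definition; the class name, sampling rule, and the keep argument are all hypothetical.

    # Illustrative sketch only: a nested ("triangular") masking layer.
    # Assumption: during training a width k is sampled per example and
    # units past k are zeroed, so any prefix remains usable after training.
    from typing import Optional

    import torch
    import torch.nn as nn


    class TriangularDropout(nn.Module):
        def __init__(self, width: int):
            super().__init__()
            self.width = width

        def forward(self, x: torch.Tensor, keep: Optional[int] = None) -> torch.Tensor:
            # At evaluation time, optionally truncate to the first `keep` units.
            if keep is not None:
                mask = torch.zeros(self.width, device=x.device, dtype=x.dtype)
                mask[:keep] = 1.0
                return x * mask
            if not self.training:
                return x
            # Training: draw a width k in [1, width] for each example and keep
            # only units 0..k-1, which corresponds to a triangular mask pattern.
            k = torch.randint(1, self.width + 1, (x.shape[0], 1), device=x.device)
            idx = torch.arange(self.width, device=x.device).unsqueeze(0)
            return x * (idx < k).to(x.dtype)

Under this reading, the trained layer could be queried at any width, e.g. layer(x, keep=16), to trade accuracy for narrowness; whether and how to rescale the kept activations is left open here.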

Related research:

Structural Dropout for Model Width Compression (05/13/2022)
Existing ML models are known to be highly over-parametrized, and use sig...

Stochastic Neural Networks with Infinite Width are Deterministic (01/30/2022)
This work theoretically studies stochastic neural networks, a main type ...

Shakeout: A New Approach to Regularized Deep Neural Network Training (04/13/2019)
Recent years have witnessed the success of deep neural networks in deali...

Consistent Dropout for Policy Gradient Reinforcement Learning (02/23/2022)
Dropout has long been a staple of supervised learning, but is rarely use...

On Convergence and Generalization of Dropout Training (10/23/2020)
We study dropout in two-layer neural networks with rectified linear unit...

Dropout's Dream Land: Generalization from Learned Simulators to Reality (09/17/2021)
A World Model is a generative model used to simulate an environment. Wor...

On the interplay of network structure and gradient convergence in deep learning (11/17/2015)
The regularization and output consistency behavior of dropout and layer-...
