Train Deep Neural Networks in 40-D Subspaces

03/20/2021
by Tao Li, et al.

Although deep neural networks contain massive numbers of parameters, training can actually proceed in a rather low-dimensional space. By investigating the low-dimensional properties of the training trajectory, we propose Dynamic Linear Dimensionality Reduction (DLDR), which dramatically reduces the parameter space to a variable subspace of significantly lower dimension. Since only a few variables remain to be optimized, second-order methods become applicable. Following this idea, we develop a quasi-Newton-based algorithm that trains the variables obtained by DLDR rather than the original parameters of the neural network. The experimental results strongly support the dimensionality reduction performance: for many standard neural networks, optimizing over only 40 variables, one can achieve performance comparable to regular training over thousands or even millions of parameters.
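The abstract does not give implementation details, but the two ingredients it names, extracting a low-dimensional subspace from snapshots of the training trajectory and then optimizing the subspace coordinates with a quasi-Newton method, can be sketched as follows. This is a minimal illustration under assumptions, not the authors' code: `loss_fn`, the snapshot collection, the SVD-based PCA step, and the use of SciPy's BFGS (standing in for whatever quasi-Newton variant the paper actually uses) are all placeholders.

```python
# Minimal sketch of the DLDR idea described above (assumptions noted in
# comments). Not the authors' implementation; details will differ.
import numpy as np
from scipy.optimize import minimize


def dldr_basis(snapshots, d=40):
    """PCA over parameter snapshots gathered along the training trajectory.

    snapshots: (T, D) array, one flattened parameter vector per checkpoint,
               with T >= d so that d principal directions exist.
    Returns the mean vector (D,) and an orthonormal basis (D, d).
    """
    mean = snapshots.mean(axis=0)
    centered = snapshots - mean
    # Thin SVD: the top right-singular vectors span the principal
    # subspace traced out by the trajectory.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:d].T  # shapes (D,) and (D, d)


def train_in_subspace(loss_fn, mean, basis):
    """Optimize only the d subspace coordinates with a quasi-Newton method.

    loss_fn: maps a flattened parameter vector (D,) to a scalar loss.
    """
    def subspace_loss(w):
        # Map the low-dimensional coordinates back to the full space.
        theta = mean + basis @ w
        return loss_fn(theta)

    w0 = np.zeros(basis.shape[1])
    # BFGS stands in for the paper's quasi-Newton scheme; with only ~40
    # variables, maintaining a dense inverse-Hessian approximation is cheap.
    # (Without an explicit gradient, SciPy falls back to finite differences,
    # which is feasible at this dimensionality.)
    res = minimize(subspace_loss, w0, method="BFGS")
    return mean + basis @ res.x
```

In this setup only the 40 entries of `w` are ever optimized; the full D-dimensional parameter vector is reconstructed as `mean + basis @ w`, which is what makes second-order updates affordable even when D runs into the millions.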


