Low-rank passthrough neural networks

Deep learning trains neural networks to perform computations that unfold sequentially over many steps, along either a time dimension or an intrinsic depth dimension. Effective learning in this setting usually requires specialized network architectures designed to mitigate the vanishing gradient problem that afflicts naive deep networks. Many of these architectures, such as LSTMs, GRUs, Highway Networks, and Deep Residual Networks, are built on a single structural principle: the state passthrough. We observe that these architectures, which we characterize as Passthrough Networks, not only mitigate the vanishing gradient problem but also decouple the network's state size from its number of parameters, a possibility exploited in some recent works but not thoroughly explored. In this work we propose simple yet effective low-rank and low-rank plus diagonal matrix parametrizations for Passthrough Networks that exploit this decoupling, reducing the data complexity and memory requirements of the network while preserving its memory capacity. We present competitive experimental results on synthetic tasks and a near-state-of-the-art result on sequential randomly-permuted MNIST classification, a hard task on natural data.
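To illustrate the idea, here is a minimal sketch of a low-rank plus diagonal parametrization in PyTorch. It is not the paper's implementation; the module name and initialization are hypothetical. An n-by-n weight matrix is parametrized as W = diag(d) + UV, with U of shape (n, r) and V of shape (r, n) for r much smaller than n, so the layer stores O(nr) parameters instead of O(n^2) while the state size n is unchanged.

```python
import torch
import torch.nn as nn

class LowRankPlusDiagonal(nn.Module):
    """Hypothetical sketch: parametrize an n x n weight matrix as
    W = diag(d) + U @ V, with U (n x r), V (r x n), and r << n.
    Parameter count drops from n^2 to n * (2r + 1) while the
    state vector the matrix acts on keeps its full size n."""

    def __init__(self, n: int, r: int):
        super().__init__()
        # Scaled Gaussian initialization (an assumption, not the paper's scheme).
        self.U = nn.Parameter(torch.randn(n, r) / r ** 0.5)
        self.V = nn.Parameter(torch.randn(r, n) / n ** 0.5)
        self.d = nn.Parameter(torch.ones(n))  # diagonal term

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n). Apply W without materializing the full matrix:
        # x @ W.T = x * d + (x @ V.T) @ U.T
        return x * self.d + (x @ self.V.t()) @ self.U.t()

# Usage: a rank-64 parametrization of a 1024-unit layer.
layer = LowRankPlusDiagonal(n=1024, r=64)
y = layer(torch.randn(32, 1024))  # y has shape (32, 1024)
```

The diagonal term lets the layer represent near-identity maps cheaply, which matters for passthrough architectures where the state update is a small perturbation of the identity.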
