Redundancy in Deep Linear Neural Networks

06/09/2022
by Oriel BenShmuel, et al.

Conventional wisdom holds that deep linear neural networks benefit from expressiveness and optimization advantages over a single linear layer. This paper suggests that, in practice, training a deep linear fully-connected network with conventional optimizers is convex in the same way as training a single linear fully-connected layer, and it aims to explain and demonstrate this claim. Although convolutional networks do not fit this description, the work aims at a new conceptual understanding of fully-connected linear networks that might shed light on the possible constraints of convolutional settings and non-linear architectures.
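As background for this claim, here is a minimal NumPy sketch (not from the paper, with arbitrary layer sizes): a stack of linear fully-connected layers computes exactly the same function as a single linear layer, because composing linear maps simply multiplies their weight matrices.

    import numpy as np

    # Minimal sketch (not from the paper): a deep *linear* fully-connected
    # network computes the same function as one linear layer, because the
    # composition of linear maps is the product of their weight matrices.
    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(64, 32))   # layer 1: 32 -> 64
    W2 = rng.normal(size=(16, 64))   # layer 2: 64 -> 16
    W3 = rng.normal(size=(10, 16))   # layer 3: 16 -> 10
    x = rng.normal(size=(32,))       # arbitrary input vector

    deep_out = W3 @ (W2 @ (W1 @ x))    # forward pass, layer by layer
    collapsed_W = W3 @ W2 @ W1         # the equivalent single linear layer
    single_out = collapsed_W @ x

    assert np.allclose(deep_out, single_out)

The sketch only shows that depth adds no representational power without non-linearities; the paper's contribution concerns how such stacks behave during training under conventional optimizers, which this snippet does not address.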

Related research

- An Equivalence of Fully Connected Layer and Convolutional Layer (12/04/2017). This article demonstrates that convolutional operation can be converted ...
- Computational Separation Between Convolutional and Fully-Connected Networks (10/03/2020). Convolutional neural networks (CNN) exhibit unmatched performance in a m...
- An exploration of parameter redundancy in deep networks with circulant projections (02/11/2015). We explore the redundancy of parameters in deep neural networks by repla...
- Identity Crisis: Memorization and Generalization under Extreme Overparameterization (02/13/2019). We study the interplay between memorization and generalization of overpa...
- Lifting Layers: Analysis and Applications (03/23/2018). The great advances of learning-based approaches in image processing and ...
- Why Are Convolutional Nets More Sample-Efficient than Fully-Connected Nets? (10/16/2020). Convolutional neural networks often dominate fully-connected counterpart...
- Neural Rule Ensembles: Encoding Sparse Feature Interactions into Neural Networks (02/11/2020). Artificial Neural Networks form the basis of very powerful learning meth...
