Toward Deeper Understanding of Neural Networks: The Power of Initialization and a Dual View on Expressivity

02/18/2016
by Amit Daniely, et al.

We develop a general duality between neural networks and compositional kernels, striving towards a better understanding of deep learning. We show that initial representations generated by common random initializations are sufficiently rich to express all functions in the dual kernel space. Hence, though the training objective is hard to optimize in the worst case, the initial weights form a good starting point for optimization. Our dual view also reveals a pragmatic and aesthetic perspective of neural networks and underscores their expressive power.
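The duality in question pairs a randomly initialized layer with a closed-form "dual" kernel on the inputs. A minimal sketch of that claim, assuming a single ReLU layer with standard Gaussian initialization: the empirical inner product of the random representations converges, as width grows, to the degree-1 arc-cosine kernel (the dual kernel of the ReLU activation). The widths, seeds, and normalization below are illustrative choices, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
d, width = 10, 200_000

# Two unit-norm inputs with some angle theta between them.
x = rng.standard_normal(d); x /= np.linalg.norm(x)
y = rng.standard_normal(d); y /= np.linalg.norm(y)

# One hidden ReLU layer with i.i.d. standard Gaussian weights
# (a common random initialization).
W = rng.standard_normal((width, d))
phi_x = np.maximum(W @ x, 0.0)
phi_y = np.maximum(W @ y, 0.0)

# Empirical kernel induced by the random initial representation.
# The factor 2 matches the normalization of the closed form below.
k_emp = 2.0 * (phi_x @ phi_y) / width

# Closed-form dual kernel of ReLU: the degree-1 arc-cosine kernel,
# k(x, y) = (sin(theta) + (pi - theta) * cos(theta)) / pi  for unit vectors.
rho = float(np.clip(x @ y, -1.0, 1.0))
theta = np.arccos(rho)
k_dual = (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / np.pi

print(k_emp, k_dual)  # agree up to O(1/sqrt(width)) fluctuations
```

The point of the sketch is the abstract's first claim in miniature: the random initial representation already computes (an approximation of) the dual kernel, so functions in the dual kernel space are expressible from the start of training.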


Related research:

- 11/09/2022: Duality for Neural Networks through Reproducing Kernel Banach Spaces. Reproducing Kernel Hilbert spaces (RKHS) have been a very successful too...
- 11/13/2015: On the Quality of the Initial Basin in Overspecified Neural Networks. Deep learning, in the form of artificial neural networks, has achieved r...
- 02/11/2022: The Dual Form of Neural Networks Revisited: Connecting Test Time Predictions to Training Patterns via Spotlights of Attention. Linear layers in neural networks (NNs) trained by gradient descent can b...
- 04/09/2020: Mehler's Formula, Branching Process, and Compositional Kernels of Deep Neural Networks. In this paper, we utilize a connection between compositional kernels and...
- 07/01/2019: On Symmetry and Initialization for Neural Networks. This work provides an additional step in the theoretical understanding o...
- 07/20/2016: On the Modeling of Error Functions as High Dimensional Landscapes for Weight Initialization in Learning Networks. Next generation deep neural networks for classification hosted on embedd...
- 09/12/2018: Linear Algebra and Duality of Neural Networks. Natural for Neural networks bases, mappings, projections and metrics are...
