Deep Layer-wise Networks Have Closed-Form Weights

02/01/2022
by Chieh Wu, et al.

There is currently a debate within the neuroscience community over the likelihood of the brain performing backpropagation (BP). To better mimic the brain, training a network one layer at a time with only a "single forward pass" has been proposed as an alternative that bypasses BP; we refer to these networks as "layer-wise" networks. We continue the work on layer-wise networks by answering two outstanding questions. First, do they have a closed-form solution? Second, how do we know when to stop adding more layers? This work proves that the Kernel Mean Embedding is the closed-form weight that achieves the network's global optimum while driving these networks to converge towards a highly desirable kernel for classification, which we call the Neural Indicator Kernel.
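The kernel mean embedding the abstract refers to is, in general, the empirical mean of a kernel feature map over a sample. As a rough illustration of the idea (not the paper's actual construction or architecture), the sketch below computes per-class empirical kernel mean embeddings under an RBF kernel and classifies a point by which class embedding its features align with most; all names, data, and the gamma value are illustrative assumptions.

```python
import numpy as np

def rbf_features(X, centers, gamma=0.5):
    # Finite-sample RBF feature map: phi(x)_j = exp(-gamma * ||x - c_j||^2),
    # evaluated against a fixed set of centers (here, the training points).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Toy two-class data (illustrative, not from the paper).
rng = np.random.default_rng(0)
X0 = rng.normal(-2.0, 1.0, size=(50, 2))
X1 = rng.normal(+2.0, 1.0, size=(50, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

# Empirical kernel mean embedding per class:
#   mu_c = (1 / n_c) * sum_{i : y_i = c} phi(x_i)
Phi = rbf_features(X, X)                               # (n, n) feature matrix
mu = np.stack([Phi[y == c].mean(0) for c in (0, 1)])   # (2, n)

# Classify a test point by similarity to each class embedding.
x_test = np.array([[2.1, 1.8]])
phi_t = rbf_features(x_test, X)
pred = int(np.argmax(phi_t @ mu.T))
print(pred)  # the point lies in the class-1 cluster, so this prints 1
```

The point of the sketch is that `mu` is computed in closed form from the data in a single pass, with no gradient-based training; the paper's contribution is proving that, for layer-wise networks, such an embedding is the globally optimal layer weight.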



Related research

11/02/2018  Closed Form Variational Objectives For Bayesian Neural Networks with a Single Hidden Layer
In this note we consider setups in which variational objectives for Baye...

06/10/2021  Front Contribution instead of Back Propagation
Deep Learning's outstanding track record across several domains has stem...

06/25/2020  A Theoretical Framework for Target Propagation
The success of deep learning, a brain-inspired form of AI, has sparked i...

07/14/2020  Layer-Parallel Training with GPU Concurrency of Deep Residual Neural Networks via Nonlinear Multigrid
A Multigrid Full Approximation Storage algorithm for solving Deep Residu...

03/11/2020  Improving the Backpropagation Algorithm with Consequentialism Weight Updates over Mini-Batches
Least mean squares (LMS) is a particular case of the backpropagation (BP...

12/20/2019  When Explanations Lie: Why Modified BP Attribution Fails
Modified backpropagation methods are a popular group of attribution meth...

02/25/2020  Convex Geometry and Duality of Over-parameterized Neural Networks
We develop a convex analytic framework for ReLU neural networks which el...
