Gradient-Free Learning Based on the Kernel and the Range Space

10/27/2018
by Kar-Ann Toh, et al.

In this article, we show that solving a system of linear equations by manipulating the kernel and the range space is equivalent to solving the least-squares error approximation problem. This lays the ground for gradient-free learning whenever the system can be expressed as a linear matrix equation. When the nonlinear activation function is invertible, the learning problem of a fully-connected multilayer feedforward neural network can be readily adapted to this framework. Through a series of kernel and range space manipulations, such network learning boils down to solving a set of cross-coupling equations. With randomly initialized weights, these equations can be decoupled, and the resulting network solution shows relatively good learning capability on real-world data sets of small to moderate dimensions. Based on the structural information of the matrix equation, the network representation is found to depend on the number of data samples and the output dimension.
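As a concrete illustration of the idea, the sketch below fits a single layer without any gradient computation: the (invertible) activation is inverted on the targets, and the resulting linear matrix equation is solved in the least-squares sense via the Moore-Penrose pseudoinverse. The variable names, the sigmoid activation, and the toy data are illustrative assumptions; the paper's full method extends this to multiple layers through the cross-coupling equations mentioned above.

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not the paper's full multilayer
# algorithm): recover weights W of a single layer Y = sigmoid(X @ W) without
# gradients by inverting the activation and solving the linear matrix
# equation X @ W = logit(Y) in the least-squares sense.

def logit(y, eps=1e-6):
    """Inverse of the sigmoid activation; clip targets away from {0, 1}."""
    y = np.clip(y, eps, 1.0 - eps)
    return np.log(y / (1.0 - y))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))            # 100 samples, 5 input features
W_true = rng.normal(size=(5, 3))         # hypothetical target weights
Y = 1.0 / (1.0 + np.exp(-X @ W_true))    # network outputs in (0, 1)

# The Moore-Penrose pseudoinverse gives the least-squares solution, i.e. the
# projection of the inverted targets onto the range space of X.
W_hat = np.linalg.pinv(X) @ logit(Y)

print("max weight error:", np.abs(W_hat - W_true).max())
```

Using the pseudoinverse here is what ties the range-space manipulation to least squares: when the system is consistent the solve is exact, and otherwise it returns the minimum-error approximation.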
