From Deep to Shallow: Equivalent Forms of Deep Networks in Reproducing Kernel Krein Space and Indefinite Support Vector Machines

07/15/2020
by Alistair Shilton, et al.

In this paper we explore a connection between deep networks and learning in reproducing kernel Krein space. Our approach is based on the concept of push-forward: converting a fixed non-linear transform applied to a linear projection into a linear projection applied to the output of a fixed non-linear transform, thereby pushing the weights forward through the non-linearity. Applying this operation repeatedly from input to output, the weights of a deep network can be progressively "pushed" to the output layer. The result is a flat network consisting of a fixed non-linear map, whose form is determined by the structure of the deep network, followed by a linear projection determined by the weight matrices - that is, the deep network is converted to an equivalent (indefinite) support vector machine. We then investigate the implications of this transformation for capacity control and generalisation, and provide a bound on the generalisation error of the deep network in terms of the generalisation error in reproducing kernel Krein space.
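The push-forward idea can be illustrated with a minimal sketch. This is not the paper's general construction (which covers arbitrary deep architectures); it is an assumed toy case using a quadratic activation, where a hidden unit sigma(w.x) = (w.x)^2 is exactly linear in the fixed, weight-independent feature map phi(x) = vec(x x^T):

```python
import numpy as np

# Toy illustration of "pushing weights forward" (quadratic activation case,
# an assumption for this sketch): sigma(w.x) = (w.x)^2 = <vec(w w^T), vec(x x^T)>,
# so the non-linear layer sigma(W x) becomes a LINEAR map A phi(x), where
# phi is fixed and all weight dependence sits in A at the output.

rng = np.random.default_rng(0)
d, h = 4, 3                      # input dimension, number of hidden units
W = rng.standard_normal((h, d))  # hidden-layer weight matrix

def net(x):
    """Original form: fixed non-linearity applied to a linear projection."""
    return (W @ x) ** 2

def phi(x):
    """Fixed non-linear feature map, independent of the weights."""
    return np.outer(x, x).ravel()

# Pushed-forward weights: row i is vec(w_i w_i^T).
A = np.stack([np.outer(w, w).ravel() for w in W])

x = rng.standard_normal(d)
assert np.allclose(net(x), A @ phi(x))  # same function, weights now linear
```

Composing such rewrites layer by layer flattens the whole network into one fixed non-linear map followed by a single linear projection, which is the (indefinite) support-vector-machine form discussed above.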


