Deep regularization and direct training of the inner layers of Neural Networks with Kernel Flows

02/19/2020
by Gene Ryan Yoo, et al.

We introduce a new regularization method for Artificial Neural Networks (ANNs) based on Kernel Flows (KFs). KFs were introduced as a method for kernel selection in regression/kriging based on the minimization of the loss of accuracy incurred by halving the number of interpolation points in random batches of the dataset. Writing f_θ(x) = (f^(n)_θ_n ∘ f^(n-1)_θ_(n-1) ∘ ... ∘ f^(1)_θ_1)(x) for the functional representation of the compositional structure of the ANN, the inner-layer outputs h^(i)(x) = (f^(i)_θ_i ∘ f^(i-1)_θ_(i-1) ∘ ... ∘ f^(1)_θ_1)(x) define a hierarchy of feature maps and kernels k^(i)(x,x') = exp(-γ_i ‖h^(i)(x) - h^(i)(x')‖_2^2). When combined with a batch of the dataset, these kernels produce KF losses e_2^(i) (the L^2 regression error incurred by using a random half of the batch to predict the other half), which depend on the parameters θ_1,...,θ_i of the inner layers (and on γ_i). The proposed method simply consists of aggregating a subset of these KF losses with a classical output loss. We test the proposed method on CNNs and Wide Residual Networks (WRNs) without altering their structure or output classifier and report reduced test errors, decreased generalization gaps, and increased robustness to distribution shift, without a significant increase in computational complexity. We suspect that these results might be explained by the fact that while conventional training only employs a linear functional (a generalized moment) of the empirical distribution defined by the dataset and can be prone to trapping in the Neural Tangent Kernel regime (under over-parameterization), the proposed loss function (defined as a nonlinear functional of the empirical distribution) effectively trains the underlying kernel defined by the CNN beyond regressing the data with that kernel.
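To make the per-layer KF loss concrete, here is a minimal PyTorch sketch of one such loss e_2^(i): a Gaussian kernel is built from an inner-layer feature map h^(i), a random half of the batch serves as interpolation points to predict the other half by kernel regression, and the squared L^2 error of that prediction is returned. The function name `kf_loss`, the nugget `reg`, and the relative-error normalization are illustrative assumptions; the authors' exact normalization and batching may differ.

```python
import torch

def kf_loss(h, y, gamma, reg=1e-6):
    """Sketch of a per-layer Kernel Flow loss e_2^(i) (illustrative,
    not the authors' exact implementation).

    h     : (N, d) inner-layer outputs h^(i)(x) for a batch of N inputs
    y     : (N, C) float targets for the batch (e.g. one-hot class labels)
    gamma : positive bandwidth gamma_i of the Gaussian kernel k^(i)
    reg   : small nugget added to the kernel matrix for numerical stability
    """
    N = h.shape[0]

    # Gaussian kernel k^(i)(x, x') = exp(-gamma_i * ||h^(i)(x) - h^(i)(x')||_2^2)
    K = torch.exp(-gamma * torch.cdist(h, h).pow(2))

    # Randomly split the batch: one half interpolates, the other half is predicted.
    perm = torch.randperm(N, device=h.device)
    half, rest = perm[: N // 2], perm[N // 2:]

    K_ss = K[half][:, half] + reg * torch.eye(half.numel(), device=h.device)
    K_rs = K[rest][:, half]

    # Kernel regression from the sampled half to the held-out half.
    pred = K_rs @ torch.linalg.solve(K_ss, y[half])

    # e_2^(i): relative squared L^2 error on the held-out half.
    return (pred - y[rest]).pow(2).sum() / y[rest].pow(2).sum().clamp_min(1e-12)
```

In training, a subset of these losses (with the γ_i treated as trainable parameters) would simply be added to the usual output loss, e.g. `loss = cross_entropy(logits, labels) + sum(kf_loss(h_i, y_onehot, gamma_i) for selected layers i)`, leaving the network architecture and output classifier unchanged.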


