On neural network kernels and the storage capacity problem

01/12/2022
by Jacob A. Zavatone-Veth, et al.

In this short note, we reify the connection between work on the storage capacity problem in wide two-layer treelike neural networks and the rapidly growing body of literature on kernel limits of wide neural networks. Concretely, we observe that the "effective order parameter" studied in the statistical mechanics literature is exactly equivalent to the infinite-width Neural Network Gaussian Process kernel. This correspondence connects the expressivity and trainability of wide two-layer neural networks.
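For a two-layer network whose hidden weights w are drawn i.i.d. Gaussian, the NNGP kernel takes the form K(x, x') = E_w[phi(w.x) phi(w.x')], which is the quantity the abstract identifies with the "effective order parameter". The sketch below is a minimal illustration, not the note's own computation: it assumes a weight variance of 1/d and an erf activation (choices made here for concreteness), estimates the kernel by Monte Carlo, and checks it against the classical closed form for erf due to Williams (1997).

```python
import numpy as np
from scipy.special import erf

def nngp_kernel_mc(x1, x2, phi=erf, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the two-layer NNGP kernel
    K(x1, x2) = E_{w ~ N(0, I/d)}[phi(w . x1) * phi(w . x2)].

    The 1/d weight variance, the activation, and the sample count
    are illustrative assumptions, not prescribed by the note.
    """
    d = x1.shape[0]
    rng = np.random.default_rng(seed)
    # Sample hidden weight vectors w ~ N(0, I/d).
    w = rng.normal(0.0, 1.0 / np.sqrt(d), size=(n_samples, d))
    return float(np.mean(phi(w @ x1) * phi(w @ x2)))

def nngp_kernel_erf(x1, x2):
    """Closed form for phi = erf with w ~ N(0, I/d) (Williams, 1997)."""
    d = x1.shape[0]
    c = 2.0 / d  # 2 * (weight variance)
    num = c * (x1 @ x2)
    den = np.sqrt((1.0 + c * (x1 @ x1)) * (1.0 + c * (x2 @ x2)))
    return (2.0 / np.pi) * np.arcsin(num / den)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x1, x2 = rng.normal(size=16), rng.normal(size=16)
    # The two estimates should agree up to Monte Carlo error.
    print(nngp_kernel_mc(x1, x2))
    print(nngp_kernel_erf(x1, x2))
```

Note that both forms depend on the inputs only through the inner products x.x, x'.x', and x.x', which is what lets the infinite-width limit collapse the network into a kernel.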


Related research

08/08/2022 · Deep Maxout Network Gaussian Process
Study of neural networks with infinite width is important for better und...

08/07/2022 · An Empirical Analysis of the Laplace and Neural Tangent Kernels
The neural tangent kernel is a kernel function defined over the paramete...

07/31/2020 · Finite Versus Infinite Neural Networks: an Empirical Study
We perform a careful, thorough, and large scale empirical study of the c...

05/20/2020 · Beyond the storage capacity: data driven satisfiability transition
Data structure has a dramatic impact on the properties of neural network...

04/19/2023 · The Responsibility Problem in Neural Networks with Unordered Targets
We discuss the discontinuities that arise when mapping unordered objects...

08/18/2006 · Parametrical Neural Networks and Some Other Similar Architectures
A review of works on associative neural networks accomplished during las...

03/15/2021 · Representation Theorem for Matrix Product States
In this work, we investigate the universal representation capacity of th...
