Kolmogorov Width Decay and Poor Approximators in Machine Learning: Shallow Neural Networks, Random Feature Models and Neural Tangent Kernels

05/21/2020
by Weinan E, et al.

We establish a scale separation of Kolmogorov width type between subspaces of a given Banach space under the condition that a sequence of linear maps converges much faster on one of the subspaces. The general technique is then applied to show that reproducing kernel Hilbert spaces are poor L^2-approximators for the class of two-layer neural networks in high dimension, and that two-layer networks with small path norm are poor approximators for certain Lipschitz functions, also in the L^2-topology.
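For context, the scale separation mentioned above is a statement of Kolmogorov width type. The classical Kolmogorov n-width of a class K in a normed space X (recalled here as standard background, not quoted from the paper) measures how well K can be approximated by the best n-dimensional linear subspace:

\[
d_n(K, X) \;=\; \inf_{\substack{V \subseteq X \\ \dim V \le n}} \; \sup_{f \in K} \; \inf_{g \in V} \, \| f - g \|_X .
\]

Roughly speaking, fast decay of d_n(K, X) as n grows means K is well captured by linear methods such as kernel or random feature expansions, while slow decay in high dimension is the sense in which one function class can be a poor approximator for another.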


Related research

09/20/2021 · Understanding neural networks with reproducing kernel Banach spaces
Characterizing the function spaces corresponding to neural networks can ...

07/03/2023 · Neural Hilbert Ladders: Multi-Layer Neural Networks in Function Space
The characterization of the function spaces explored by neural networks...

03/06/2020 · Random sampling in weighted reproducing kernel subspaces of L^p_ν(R^d)
In this paper, we mainly study the random sampling and reconstruction fo...

09/02/2018 · On overcoming the Curse of Dimensionality in Neural Networks
Let H be a reproducing kernel Hilbert space. For i=1,...,N, let x_i∈R^d ...

02/01/2023 · Gradient Descent in Neural Networks as Sequential Learning in RKBS
The study of Neural Tangent Kernels (NTKs) has provided much needed insi...

03/04/2021 · Function Approximation via Sparse Random Features
Random feature methods have been successful in various machine learning ...

06/10/2020 · Representation formulas and pointwise properties for Barron functions
We study the natural function space for infinitely wide two-layer neural...
