A duality framework for generalization analysis of random feature models and two-layer neural networks

05/09/2023
by   Hongrui Chen, et al.

We consider the problem of learning functions in the ℱ_{p,π} and Barron spaces, which are natural function spaces arising in the high-dimensional analysis of random feature models (RFMs) and two-layer neural networks. Through a duality analysis, we reveal that approximation and estimation in these spaces can be considered equivalent in a certain sense. This allows us to focus on the easier of the two problems, approximation or estimation, when studying the generalization of either model. The dual equivalence is established by defining an information-based complexity that effectively controls estimation errors. We further demonstrate the flexibility of our duality framework through two concrete applications. The first application is learning functions in ℱ_{p,π} with RFMs. We prove that the learning does not suffer from the curse of dimensionality as long as p > 1, implying that RFMs can work beyond the kernel regime. Our analysis extends the existing results of [CMM21] to the noisy case and removes the requirement of overparameterization. The second application is the learnability of a reproducing kernel Hilbert space (RKHS) under the L^∞ metric. We derive lower and upper bounds on the minimax estimation error in terms of the spectrum of the associated kernel, and we apply these bounds to dot-product kernels to analyze how they scale with the input dimension. Our results suggest that learning with ReLU (random) features is generally intractable when high uniform accuracy is required.
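To make the objects in the abstract concrete, the following is a minimal, self-contained sketch, not the paper's method: it fits a random feature model with ReLU features by ridge regression and inspects the empirical spectrum of the induced dot-product kernel, the quantity in which the L^∞ bounds above are phrased. All names and parameter choices here (d, n, m, lam, and the single-neuron target) are illustrative assumptions, chosen only to make the example runnable.

```python
# Illustrative sketch (assumed setup, not the paper's construction):
# ridge regression over random ReLU features, plus the empirical
# spectrum of the associated dot-product kernel.
import numpy as np

rng = np.random.default_rng(0)
d, n, m, lam = 10, 500, 1000, 1e-3  # input dim, samples, features, ridge

# Synthetic inputs on the unit sphere with a noisy single-neuron target
# (a hypothetical target, used only to generate labels).
X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
w_star = rng.standard_normal(d) / np.sqrt(d)
y = np.maximum(X @ w_star, 0.0) + 0.1 * rng.standard_normal(n)

# Random ReLU features: phi_j(x) = relu(<w_j, x>), w_j ~ N(0, I/d).
W = rng.standard_normal((m, d)) / np.sqrt(d)
Phi = np.maximum(X @ W.T, 0.0) / np.sqrt(m)  # shape (n, m)

# Standard RFM estimator: ridge regression in feature space.
a = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)

# Empirical version of the dot-product kernel induced by the features,
# k(x, x') ≈ <phi(x), phi(x')>. Its eigenvalue decay is the spectral
# quantity that the minimax L^∞ bounds are expressed in.
K = Phi @ Phi.T
eigs = np.linalg.eigvalsh(K)[::-1]  # descending order
print("train MSE:", np.mean((Phi @ a - y) ** 2))
print("top-5 kernel eigenvalues:", eigs[:5])
```

Under this kind of setup, the slow eigenvalue decay typical of ReLU dot-product kernels is, informally, the mechanism behind the intractability of reaching high uniform (L^∞) accuracy discussed in the abstract.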


