The Restricted Isometry of ReLU Networks: Generalization through Norm Concentration

07/01/2020
by Alex Goeßmann, et al.

While regression tasks aim to interpolate a relation over the entire input space, they often have to be solved with a limited amount of training data. Still, if the hypothesis functions can be sketched well by the data, one can hope to identify a generalizing model. In this work, we introduce the Neural Restricted Isometry Property (NeuRIP), a uniform concentration event in which all shallow ReLU networks are sketched with the same quality. To derive the sample complexity sufficient for NeuRIP to hold, we bound the covering numbers of the networks in the sub-Gaussian metric and apply chaining techniques. On the NeuRIP event, we then provide bounds on the expected risk that hold for all networks in any sublevel set of the empirical risk. We conclude that all networks with sufficiently small empirical risk generalize uniformly.
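As a rough illustration (the specific norms, constants, and the exact form of the inequality are assumptions for exposition, not taken from the paper), a NeuRIP-type event can be read as a restricted-isometry condition on the empirical sketch of every function $f$ in the considered class $\mathcal{F}$ of shallow ReLU networks:

$$(1-\epsilon)\,\|f\|_{L^2(\mu)}^2 \;\le\; \frac{1}{m}\sum_{i=1}^{m} f(x_i)^2 \;\le\; (1+\epsilon)\,\|f\|_{L^2(\mu)}^2 \qquad \text{for all } f \in \mathcal{F},$$

where $x_1,\dots,x_m$ are the training inputs drawn from the data distribution $\mu$. On such an event, the empirical sketch of every $f \in \mathcal{F}$ is uniformly faithful to its true norm, so a small empirical risk can be transferred to a bound on the expected risk; this is, roughly, the mechanism behind the sublevel-set generalization statement above.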
