Lipschitz widths

11/02/2021
by Guergana Petrova et al.

This paper introduces a measure, called Lipschitz widths, of the optimal performance achievable by certain nonlinear methods of approximation. It discusses their relation to entropy numbers and to other well-known widths, such as the Kolmogorov and stable manifold widths. It also shows that Lipschitz widths provide a theoretical benchmark for the approximation quality achieved via deep neural networks.
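As background (these are the standard textbook definitions, not taken from the abstract itself), the Kolmogorov widths and entropy numbers that Lipschitz widths are compared against can be written as:

```latex
% Kolmogorov n-width of a compact set K in a Banach space X:
% the error of the best approximation of K by n-dimensional linear subspaces.
d_n(K)_X := \inf_{\dim Y = n} \; \sup_{f \in K} \; \inf_{g \in Y} \|f - g\|_X .

% Entropy numbers of K in X: the smallest radius \varepsilon such that
% K can be covered by 2^n balls of radius \varepsilon.
\varepsilon_n(K)_X := \inf \bigl\{ \varepsilon > 0 :
    K \subset \textstyle\bigcup_{j=1}^{2^n} B(x_j, \varepsilon)
    \text{ for some } x_1, \dots, x_{2^n} \in X \bigr\} .
```

Both quantities measure how well the set K can be approximated, by linear subspaces and by finite covers respectively; the paper relates Lipschitz widths to both.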


Related research:

09/29/2020  Lipschitz neural networks are dense in the set of all Lipschitz functions
  This note shows that, for a fixed Lipschitz constant L > 0, one layer ne...

09/21/2020  Optimal Stable Nonlinear Approximation
  While it is well known that nonlinear methods of approximation can often...

02/15/2022  On the entropy numbers and the Kolmogorov widths
  Direct estimates between linear or nonlinear Kolmogorov widths and entro...

08/08/2022  A Theoretical View on Sparsely Activated Networks
  Deep and wide neural networks successfully fit very complex functions to...

06/04/2021  Can convolutional ResNets approximately preserve input distances? A frequency analysis perspective
  ResNets constrained to be bi-Lipschitz, that is, approximately distance ...

01/27/2023  Direct Parameterization of Lipschitz-Bounded Deep Networks
  This paper introduces a new parameterization of deep neural networks (bo...

03/20/2019  Box-constrained monotone L_∞-approximations and Lipschitz-continuous regularized functions
  Let f:[0,1]→[0,1] be a nondecreasing function. The main goal of this wor...
