Weighted Optimization: better generalization by smoother interpolation

06/15/2020
by Yuege Xie, et al.

We provide a rigorous analysis of how an implicit bias towards smooth interpolations leads to low generalization error in the overparameterized setting. We present the first case study of this connection through a random Fourier series model and weighted least squares. We then argue, through this model and through numerical experiments, that normalization methods in deep learning, such as weight normalization, improve generalization in overparameterized neural networks by implicitly encouraging smooth interpolants.
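To make the mechanism concrete, here is a minimal Python sketch of minimum weighted-norm interpolation with a random Fourier feature model. It is an illustration under assumptions, not the paper's exact setup: the weighting scheme (weights growing linearly with frequency), the cosine-only feature matrix, and all variable names are choices made for this example. The point it demonstrates is that penalizing high-frequency coefficients pushes the interpolant's energy into low frequencies, yielding a smoother fit than the unweighted minimum-norm solution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overparameterized setting: p features >> n samples.
n, p = 10, 200
x = np.sort(rng.uniform(0.0, 1.0, n))                      # sample locations in [0, 1]
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(n)   # noisy smooth target

freqs = np.arange(p)                                        # Fourier frequencies 0..p-1
Phi = np.cos(2 * np.pi * np.outer(x, freqs))                # feature matrix, shape (n, p)

def min_weighted_norm_interpolant(Phi, y, w):
    """Solve min ||diag(w) c||_2 subject to Phi c = y.

    Substituting z = diag(w) c reduces the problem to the plain
    minimum-norm solution of (Phi diag(1/w)) z = y, which lstsq
    returns directly for an underdetermined system.
    """
    A = Phi / w                                   # Phi @ diag(1/w), column-wise scaling
    z, *_ = np.linalg.lstsq(A, y, rcond=None)     # minimum-norm solution of A z = y
    return z / w                                  # map back: c = diag(1/w) z

w_flat = np.ones(p)                    # unweighted: ordinary minimum-norm interpolation
w_grow = 1.0 + freqs.astype(float)     # assumed scheme: weights grow with frequency

c_flat = min_weighted_norm_interpolant(Phi, y, w_flat)
c_grow = min_weighted_norm_interpolant(Phi, y, w_grow)

# Smoothness proxy: how much coefficient energy sits in high frequencies.
hi = freqs > 20
print("high-frequency energy, unweighted:", np.sum(c_flat[hi] ** 2))
print("high-frequency energy, weighted:  ", np.sum(c_grow[hi] ** 2))
```

Both solutions interpolate the data exactly, but the weighted one concentrates its coefficients on low frequencies; the substitution c = diag(1/w) z is the standard trick that turns a weighted-norm problem into one that a minimum-norm least-squares solver handles directly.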

Related research

08/07/2020
Generalization error of minimum weighted norm and kernel interpolation
We study the generalization error of functions that interpolate prescrib...

01/06/2022
Grokking: Generalization Beyond Overfitting on Small Algorithmic Datasets
In this paper we propose to study generalization of neural networks on s...

05/09/2023
Robust Implicit Regularization via Weight Normalization
Overparameterized models may have many interpolating solutions; implicit...

03/09/2022
On the influence of over-parameterization in manifold based surrogates and deep neural operators
Constructing accurate and generalizable approximators for complex physic...

06/14/2022
Understanding the Generalization Benefit of Normalization Layers: Sharpness Reduction
Normalization layers (e.g., Batch Normalization, Layer Normalization) we...

10/05/2020
Smaller generalization error derived for deep compared to shallow residual neural networks
Estimates of the generalization error are proved for a residual neural n...

01/23/2022
A Generalized Weighted Optimization Method for Computational Learning and Inversion
The generalization capacity of various machine learning models exhibits ...
