A Quadrature Perspective on Frequency Bias in Neural Network Training with Nonuniform Data

05/28/2022
by Annan Yu, et al.

Small generalization errors of over-parameterized neural networks (NNs) can be partially explained by the frequency biasing phenomenon, where gradient-based algorithms minimize the low-frequency misfit before reducing the high-frequency residuals. Using the Neural Tangent Kernel (NTK), one can provide a theoretically rigorous analysis for training where data are drawn from constant or piecewise-constant probability densities. Since most training data sets are not drawn from such distributions, we use the NTK model and a data-dependent quadrature rule to theoretically quantify the frequency biasing of NN training given fully nonuniform data. By replacing the loss function with a carefully selected Sobolev norm, we can further amplify, dampen, counterbalance, or reverse the intrinsic frequency biasing in NN training.
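The Sobolev-norm idea in the abstract can be illustrated with a small sketch. Writing the residual in a Fourier basis, an H^s-type loss weights the k-th mode by (1 + |k|^2)^s, so s > 0 up-weights high-frequency misfit (counteracting the intrinsic low-frequency bias) while s < 0 dampens it. The function name and the specific weight (1 + k^2)^s are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def sobolev_loss(residual, s=0.0):
    """Discrete Sobolev-type H^s loss of a residual on a uniform grid.

    Illustrative sketch: s = 0 recovers the ordinary L2 loss; s > 0
    up-weights high-frequency misfit, s < 0 down-weights it.
    """
    n = residual.size
    r_hat = np.fft.rfft(residual) / n           # Fourier coefficients of the residual
    k = np.arange(r_hat.size)                   # frequency indices
    weights = (1.0 + k**2) ** s                 # assumed Sobolev weights (1 + |k|^2)^s
    return float(np.sum(weights * np.abs(r_hat) ** 2))

# Residual with equal-amplitude low- and high-frequency components:
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
r = np.sin(x) + np.sin(20 * x)

l2 = sobolev_loss(r, s=0.0)   # plain L2: both modes contribute equally
h1 = sobolev_loss(r, s=1.0)   # H^1-type: the k = 20 mode dominates the loss
```

With s = 1 the high-frequency mode's contribution grows by a factor of (1 + 20^2), so a gradient method driven by this loss attacks the high-frequency residual first, reversing the usual ordering.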
