Frequency Bias in Neural Networks for Input of Non-Uniform Density

03/10/2020
by Ronen Basri, et al.

Recent works have partly attributed the generalization ability of over-parameterized neural networks to frequency bias – networks trained with gradient descent on data drawn from a uniform distribution find a low frequency fit before high frequency ones. As realistic training sets are not drawn from a uniform distribution, here we use the Neural Tangent Kernel (NTK) model to explore the effect of variable density on training dynamics. Our results, which combine analytic and empirical observations, show that when learning a pure harmonic function of frequency κ, convergence at a point x ∈ S^{d-1} occurs in time O(κ^d/p(x)), where p(x) denotes the local density at x. Specifically, for data in S^1 we analytically derive the eigenfunctions of the kernel associated with the NTK for two-layer networks. We further prove convergence results for deep, fully connected networks with respect to the spectral decomposition of the NTK. Our empirical study highlights similarities and differences between deep and shallow networks in this model.
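To illustrate the claimed density dependence numerically, here is a minimal sketch (not the authors' code): kernel gradient descent with a standard closed form for the two-layer ReLU NTK on the circle S^1, fitting a pure harmonic of frequency κ on samples drawn from a non-uniform (von Mises) density. The kernel normalization, the choice of density p, and all hyperparameters are assumptions made for illustration; the expected qualitative outcome is that residuals shrink faster where the sampling density p(x) is higher.

```python
# Minimal sketch: kernel gradient descent with the two-layer ReLU NTK on S^1.
# The kernel normalization, density, and hyperparameters are illustrative
# assumptions, not the paper's experimental setup.
import numpy as np

def ntk_2layer_relu(u):
    """Two-layer ReLU NTK on unit vectors, as a function of u = <x, z>
    (one common normalization; the paper may scale it differently)."""
    u = np.clip(u, -1.0, 1.0)
    theta = np.arccos(u)
    k0 = (np.pi - theta) / (2 * np.pi)                       # E[sigma'(w.x) sigma'(w.z)]
    k1 = (np.sin(theta) + (np.pi - theta) * u) / (2 * np.pi) # E[sigma(w.x) sigma(w.z)]
    return u * k0 + k1

rng = np.random.default_rng(0)
n, freq, lr, steps = 512, 4, 1.0, 1000

# Non-uniform density on S^1: angles concentrated near phi = 0 (assumed p).
phi = rng.vonmises(mu=0.0, kappa=2.0, size=n)
X = np.stack([np.cos(phi), np.sin(phi)], axis=1)   # points on the circle S^1
y = np.cos(freq * phi)                             # pure harmonic target of frequency kappa

K = ntk_2layer_relu(X @ X.T)                       # NTK Gram matrix

# Kernel gradient descent on the squared loss: f <- f - (lr/n) * K (f - y)
f = np.zeros(n)
for _ in range(steps):
    f -= (lr / n) * K @ (f - y)

# Points in the high-density region (small |phi|) should show smaller residuals,
# consistent with convergence time scaling like O(kappa^d / p(x)).
res = np.abs(f - y)
dense = np.abs(phi) < 0.5
print("mean |residual|, dense region :", res[dense].mean())
print("mean |residual|, sparse region:", res[~dense].mean())
```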


