Spectral Analysis of the Neural Tangent Kernel for Deep Residual Networks

04/07/2021
by Yuval Belfer, et al.

Deep residual network architectures have been shown to achieve superior accuracy over classical feed-forward networks, yet their success is still not fully understood. Focusing on massively over-parameterized, fully connected residual networks with ReLU activation through their respective neural tangent kernels (ResNTK), we provide here a spectral analysis of these kernels. Specifically, we show that, much like the NTK for fully connected networks (FC-NTK), for input distributed uniformly on the hypersphere 𝕊^{d-1}, the eigenfunctions of ResNTK are the spherical harmonics and the eigenvalues decay polynomially with frequency k as k^{-d}. These, in turn, imply that the set of functions in their Reproducing Kernel Hilbert Space is identical to that of FC-NTK, and consequently also to that of the Laplace kernel. We further show, by drawing on the analogy to the Laplace kernel, that, depending on the choice of a hyper-parameter that balances between the skip and residual connections, ResNTK can either become spiky with depth, as with FC-NTK, or maintain a stable shape.
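The sketch below is a numerical illustration of the claimed spectral behavior, not the paper's code. It uses the standard closed form of the two-layer fully connected ReLU NTK (rather than the paper's ResNTK recursion, which requires its definitions) and a Laplace kernel restricted to the sphere, and estimates their per-frequency Funk-Hecke eigenvalues by projecting onto Legendre polynomials for d = 3. The bandwidth of the Laplace kernel and the frequency range are illustrative choices; both kernels are expected to show a log-log slope of roughly -d.

```python
# Minimal sketch (assumptions noted above): estimate the spherical-harmonic
# eigenvalue decay of the two-layer ReLU FC-NTK and of a Laplace kernel on
# S^{d-1} for d = 3, where both are expected to decay roughly as k^{-d}.
import numpy as np
from scipy.special import eval_legendre

d = 3  # inputs on S^{d-1} = S^2; the Funk-Hecke weight (1-t^2)^{(d-3)/2} is then constant


def fc_ntk(t):
    """Two-layer ReLU NTK as a function of t = <x, y> for unit-norm inputs."""
    t = np.clip(t, -1.0, 1.0)
    kappa0 = (np.pi - np.arccos(t)) / np.pi
    kappa1 = (t * (np.pi - np.arccos(t)) + np.sqrt(1.0 - t**2)) / np.pi
    return t * kappa0 + kappa1


def laplace(t, c=1.0):
    """Laplace kernel exp(-c ||x - y||) on the sphere, where ||x - y|| = sqrt(2 - 2t)."""
    return np.exp(-c * np.sqrt(np.clip(2.0 - 2.0 * t, 0.0, None)))


# Funk-Hecke formula: for d = 3 the eigenvalue at frequency k is proportional to
# int_{-1}^{1} K(t) P_k(t) dt, approximated here with Gauss-Legendre quadrature.
nodes, weights = np.polynomial.legendre.leggauss(4000)


def eigenvalue(kernel, k):
    return np.sum(weights * kernel(nodes) * eval_legendre(k, nodes))


# Odd frequencies k >= 3 of the ReLU NTK vanish, so fit the decay over even k only.
ks = np.arange(2, 61, 2)
for name, kernel in [("FC-NTK", fc_ntk), ("Laplace", laplace)]:
    lam = np.array([eigenvalue(kernel, k) for k in ks])
    slope = np.polyfit(np.log(ks), np.log(np.abs(lam)), 1)[0]
    print(f"{name}: fitted log-log slope = {slope:.2f} (expected about {-d})")
```

The matching slopes for the two kernels give a concrete sense of why their eigenvalue decay, and hence their RKHS, coincide; the abstract's claim is that ResNTK exhibits the same k^{-d} decay.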


