On the validity of kernel approximations for orthogonally-initialized neural networks

04/13/2021
by James Martens, et al.

In this note we extend kernel function approximation results for neural networks with Gaussian-distributed weights to single-layer networks initialized using Haar-distributed random orthogonal matrices (with possible rescaling). This is accomplished using recent results from random matrix theory.
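As a rough illustration of the claim (not code from the paper), one can check empirically that a single hidden layer with Haar-distributed orthogonal weights induces approximately the same kernel as one with Gaussian weights of matching scale. The helper names, the ReLU nonlinearity, and the 1/d Gaussian variance below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 256  # input dimension = layer width (orthogonal weight matrices are square)

def haar_orthogonal(n, rng):
    # QR of a Gaussian matrix, with the signs of R's diagonal folded into Q,
    # yields a Haar-distributed random orthogonal matrix.
    a = rng.standard_normal((n, n))
    q, r = np.linalg.qr(a)
    return q * np.sign(np.diag(r))

def empirical_kernel(x1, x2, sample_weights, n_trials=300):
    # Monte Carlo estimate of the single-layer ReLU kernel
    #   K(x1, x2) = E_W[ (1/d) * relu(W x1) . relu(W x2) ]
    total = 0.0
    for _ in range(n_trials):
        w = sample_weights()
        total += np.maximum(w @ x1, 0) @ np.maximum(w @ x2, 0) / d
    return total / n_trials

x1 = rng.standard_normal(d)
x2 = rng.standard_normal(d)

# Gaussian init with variance 1/d per entry; a plain Haar orthogonal matrix
# already matches this scale, since it preserves the norm of its input.
k_gauss = empirical_kernel(x1, x2, lambda: rng.standard_normal((d, d)) / np.sqrt(d))
k_orth = empirical_kernel(x1, x2, lambda: haar_orthogonal(d, rng))
print(k_gauss, k_orth)
```

At this width the two estimates agree closely, consistent with the note's claim that the Gaussian kernel approximation results carry over to orthogonal initialization.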


Related research:

- "Orthogonal Random Features" (10/28/2016): We present an intriguing discovery related to Random Fourier Features: i...
- "Spectral Analysis of Kernel and Neural Embeddings: Optimization and Generalization" (05/13/2019): We extend the recent results of (Arora et al., 2019) by a spectral analy...
- "Eigenvalue Distribution of Large Random Matrices Arising in Deep Neural Networks: Orthogonal Case" (01/12/2022): The paper deals with the distribution of singular values of the input-ou...
- "What can linearized neural networks actually say about generalization?" (06/12/2021): For certain infinitely-wide neural networks, the neural tangent kernel (...
- "A fast point solver for deep nonlinear function approximators" (08/30/2021): Deep kernel processes (DKPs) generalise Bayesian neural networks, but do...
- "Local Random Feature Approximations of the Gaussian Kernel" (04/12/2022): A fundamental drawback of kernel-based statistical models is their limit...
- "Analysis of One-Hidden-Layer Neural Networks via the Resolvent Method" (05/11/2021): We compute the asymptotic empirical spectral distribution of a non-linea...
