Efficient NTK using Dimensionality Reduction

10/10/2022
by Nir Ailon, et al.

Recently, the neural tangent kernel (NTK) has been used to explain the dynamics of learning the parameters of neural networks in the large-width limit. Quantitative analyses of the NTK call for network widths that are often impractical and incur high time and energy costs in both training and deployment. Using a matrix factorization technique, we show how to obtain guarantees similar to those of a prior analysis while reducing training and inference resource costs. Our result becomes even more important when the data dimension of the input points is of the same order as the number of input points. More generally, our work suggests how to analyze large-width networks in which dense linear layers are replaced with a low-complexity factorization, thus reducing the heavy dependence on the large width.
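The abstract does not spell out the factorization it uses, but the general idea of replacing a dense width-m linear layer with a low-rank product can be sketched in a few lines. The NumPy snippet below is a minimal illustration under that assumption; the two-hidden-layer architecture, the rank r, and all variable names are illustrative choices, not the authors' construction.

```python
# Illustrative sketch (not the paper's construction): replace a dense
# width-m hidden layer's m x m weight matrix with a rank-r factorization
# W ~= U @ V, so the layer costs O(m*r) per forward pass instead of O(m^2).
import numpy as np

rng = np.random.default_rng(0)
m, r, d = 4096, 64, 128  # width, factorization rank, input dim (illustrative)

# Dense baseline: d -> m -> m -> 1, with an m x m inner dense layer.
W_in  = rng.normal(size=(m, d)) / np.sqrt(d)
W_mid = rng.normal(size=(m, m)) / np.sqrt(m)  # m^2 parameters
w_out = rng.normal(size=m) / np.sqrt(m)

# Factorized variant: the m x m layer becomes U (m x r) times V (r x m),
# i.e. 2*m*r parameters -- far fewer when r << m.
U = rng.normal(size=(m, r)) / np.sqrt(r)
V = rng.normal(size=(r, m)) / np.sqrt(m)

relu = lambda z: np.maximum(z, 0.0)

def forward_dense(x):
    return w_out @ relu(W_mid @ relu(W_in @ x))

def forward_factorized(x):
    # Two thin matmuls (apply V, then U) replace one m x m matmul.
    return w_out @ relu(U @ (V @ relu(W_in @ x)))

x = rng.normal(size=d)
print(forward_dense(x), forward_factorized(x))
print("dense params:", m * m, "factorized params:", 2 * m * r)
```

With r much smaller than m, the inner layer drops from m^2 to 2*m*r parameters, which is where the training and inference savings the abstract describes would come from.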

Related Research

10/24/2022
Fast and Low-Memory Deep Neural Networks Using Binary Matrix Factorization
Despite the outstanding performance of deep neural networks in different...

11/29/2022
Infinite-width limit of deep linear neural networks
This paper studies the infinite-width limit of deep linear neural networ...

09/27/2019
In-training Matrix Factorization for Parameter-frugal Neural Machine Translation
In this paper, we propose the use of in-training matrix factorization to...

02/01/2022
Neural Tangent Kernel Beyond the Infinite-Width Limit: Effects of Depth and Initialization
Neural Tangent Kernel (NTK) is widely used to analyze overparametrized n...

05/10/2021
Exploiting Elasticity in Tensor Ranks for Compressing Neural Networks
Elasticities in depth, width, kernel size and resolution have been explo...

06/15/2020
Feature Space Saturation during Training
We propose layer saturation - a simple, online-computable method for ana...

04/24/2021
Width Transfer: On the (In)variance of Width Optimization
Optimizing the channel counts for different layers of a CNN has shown gr...
