Spectral Analysis of Latent Representations

07/19/2019
by Justin Shenk, et al.

We propose a metric, Layer Saturation, defined as the proportion of eigenvalues needed to explain 99% of the variance of a layer's latent representations, for analyzing the learned representations of neural network layers. Saturation is based on spectral analysis and can be computed efficiently, making live analysis of the representations practical during training. We provide an outlook for future applications of this metric by outlining the behaviour of layer saturation in different neural architectures and problems. We further show that saturation is related to the generalization and predictive performance of neural networks.
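
For concreteness, here is a minimal sketch of the saturation computation as described above, written in NumPy. The function name, the threshold argument, and the batch-of-activations interface are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def layer_saturation(activations, var_threshold=0.99):
    """Fraction of covariance eigenvalues needed to explain
    `var_threshold` of the variance of a layer's representations.

    activations: (n_samples, n_features) array of one layer's
    outputs collected over a batch of inputs.
    """
    # Covariance of the latent representations across the batch.
    cov = np.cov(activations, rowvar=False)
    # Covariance matrices are symmetric, so eigvalsh applies; it
    # returns eigenvalues in ascending order, hence the reversal.
    eigvals = np.linalg.eigvalsh(cov)[::-1]
    explained = np.cumsum(eigvals) / eigvals.sum()
    # Smallest k whose top-k eigenvalues reach the threshold.
    k = int(np.searchsorted(explained, var_threshold)) + 1
    return k / len(eigvals)

For a layer with 256 output features observed over a batch of 1024 inputs, the activations would be passed as a (1024, 256) array. A saturation near 1 means nearly all eigendirections carry variance, while a low value indicates the representations occupy a low-dimensional subspace; since only a covariance matrix and its eigenvalues are needed, the metric is cheap enough to monitor live during training, as the abstract notes.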

Related research

05/13/2019 · Spectral Analysis of Kernel and Neural Embeddings: Optimization and Generalization
We extend the recent results of (Arora et al., 2019) by a spectral analy...

07/24/2019 · A Fine-Grained Spectral Perspective on Neural Networks
Are neural networks biased toward simple functions? Does depth always he...

11/06/2021 · Understanding Layer-wise Contributions in Deep Neural Networks through Spectral Analysis
Spectral analysis is a powerful tool, decomposing any function into simp...

03/20/2023 · Model Stitching: Looking For Functional Similarity Between Representations
Model stitching (Lenc & Vedaldi, 2015) is a compelling methodology to c...

10/05/2015 · Nonlinear Spectral Analysis via One-homogeneous Functionals - Overview and Future Prospects
We present in this paper the motivation and theory of nonlinear spectral...

10/12/2022 · GULP: a prediction-based metric between representations
Comparing the representations learned by different neural networks has r...
