Analytic Study of Families of Spurious Minima in Two-Layer ReLU Neural Networks

07/21/2021
by Yossi Arjevani, et al.

We study the optimization problem associated with fitting two-layer ReLU neural networks with respect to the squared loss, where labels are generated by a target network. We make use of the rich symmetry structure to develop a novel set of tools for studying families of spurious minima. In contrast to existing approaches which operate in limiting regimes, our technique directly addresses the nonconvex loss landscape for a finite number of inputs d and neurons k, and provides analytic, rather than heuristic, information. In particular, we derive analytic estimates for the loss at different minima, and prove that, modulo O(d^-1/2) terms, the Hessian spectrum concentrates near small positive constants, with the exception of Θ(d) eigenvalues which grow linearly with d. We further show that the Hessian spectra at global and spurious minima coincide to O(d^-1/2) order, thus challenging our ability to argue about statistical generalization through local curvature. Lastly, our technique provides the exact fractional dimensionality at which families of critical points turn from saddles into spurious minima. This makes possible the study of the creation and the annihilation of spurious minima using powerful tools from equivariant bifurcation theory.
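The student-teacher setup described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's method: it assumes Gaussian inputs, hypothetical sizes d, k, n, a student of the same architecture as the teacher, and plain gradient descent. It then probes the Hessian spectrum at the endpoint with finite differences, since the spectrum at minima is the central object of the analysis.

```python
import numpy as np

# Illustrative student-teacher sketch of the setup in the abstract:
# labels are generated by a target ("teacher") two-layer ReLU network,
# and a student of the same architecture is fit under the squared loss.
# The sizes d, k, n, the Gaussian input model, and plain gradient
# descent are assumptions for illustration only.
rng = np.random.default_rng(0)
d, k, n = 8, 8, 4000               # input dim, hidden neurons, samples

V = rng.standard_normal((k, d)) / np.sqrt(d)   # fixed teacher weights
X = rng.standard_normal((n, d))                # Gaussian inputs
y = np.maximum(X @ V.T, 0.0).sum(axis=1)       # teacher-generated labels

def loss(W):
    """Empirical squared loss of the student with weight matrix W."""
    pred = np.maximum(X @ W.T, 0.0).sum(axis=1)
    return 0.5 * np.mean((pred - y) ** 2)

def grad(W):
    """Gradient of the loss; the ReLU derivative is the 0/1 mask Z > 0."""
    Z = X @ W.T
    r = np.maximum(Z, 0.0).sum(axis=1) - y     # residuals
    return ((r[:, None] * (Z > 0)).T @ X) / n

W0 = rng.standard_normal((k, d)) / np.sqrt(d)
loss_init = loss(W0)

# Plain gradient descent; depending on the start it may reach a global
# or a spurious minimum of the nonconvex landscape.
W = W0.copy()
for _ in range(400):
    W -= 0.05 * grad(W)

# Finite-difference Hessian at the endpoint, to inspect its spectrum.
p = k * d
H = np.zeros((p, p))
eps = 1e-4
for j in range(p):
    E = np.zeros(p); E[j] = eps
    gp = grad(W + E.reshape(k, d)).ravel()
    gm = grad(W - E.reshape(k, d)).ravel()
    H[:, j] = (gp - gm) / (2 * eps)
H = 0.5 * (H + H.T)                # symmetrize finite-difference noise
eigs = np.linalg.eigvalsh(H)
print(f"loss: {loss_init:.4f} -> {loss(W):.4f}")
print(f"Hessian eigenvalues in [{eigs[0]:.3f}, {eigs[-1]:.3f}]")
```

At the tiny sizes used here the asymptotic picture (eigenvalues concentrating near small positive constants, with Θ(d) outliers growing linearly in d) is of course not visible; the sketch only shows how one would inspect the spectrum at a minimum reached by descent.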


