On the non-universality of deep learning: quantifying the cost of symmetry

08/05/2022
by Emmanuel Abbe, et al.

We prove computational limitations for learning with neural networks trained by noisy gradient descent (GD). Our result applies whenever GD training is equivariant (true for many standard architectures), and quantifies the alignment needed between architectures and data in order for GD to learn. As applications, (i) we characterize the functions that fully-connected networks can weak-learn on the binary hypercube and unit sphere, demonstrating that depth-2 is as powerful as any other depth for this task; (ii) we extend the merged-staircase necessity result for learning with latent low-dimensional structure [ABM22] beyond the mean-field regime. Our techniques extend to stochastic gradient descent (SGD), for which we show nontrivial hardness results for learning with fully-connected networks, based on cryptographic assumptions.
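As a concrete illustration of the kind of equivariance the result relies on (a minimal sketch, not taken from the paper), the snippet below checks numerically that a full-batch GD step on a one-hidden-layer fully-connected network commutes with permutations of the input coordinates: permuting the data and the first-layer weight columns before the step gives the same result as permuting after the step. The two-layer tanh architecture, squared loss, step size, and all variable names are illustrative assumptions; noisy GD as in the paper adds i.i.d. Gaussian noise to the update, under which the same equivariance holds in distribution rather than exactly.

```python
# Sketch: permutation-equivariance of one (noiseless) GD step for a
# fully-connected one-hidden-layer network with squared loss.
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 64, 10, 32                  # samples, input dimension, hidden width
X = rng.standard_normal((n, d))
y = np.sign(X[:, 0] * X[:, 1])        # arbitrary target, only used to define a loss

W = rng.standard_normal((m, d)) / np.sqrt(d)   # first-layer weights
a = rng.standard_normal(m) / np.sqrt(m)        # second-layer weights

def gd_step(X, y, W, a, lr=0.1):
    """One full-batch gradient-descent step on the squared loss."""
    H = X @ W.T                        # (n, m) pre-activations
    S = np.tanh(H)                     # hidden activations
    preds = S @ a                      # (n,) network outputs
    g_pred = 2.0 * (preds - y) / len(y)
    g_a = S.T @ g_pred
    g_H = np.outer(g_pred, a) * (1.0 - S**2)   # chain rule through tanh
    g_W = g_H.T @ X
    return W - lr * g_W, a - lr * g_a

perm = rng.permutation(d)              # a symmetry of the fully-connected architecture

# GD step on the original data, then permute the first-layer columns ...
W1, a1 = gd_step(X, y, W, a)
# ... versus a GD step on permuted data from a correspondingly permuted initialization.
W1p, a1p = gd_step(X[:, perm], y, W[:, perm], a)

print(np.allclose(W1[:, perm], W1p), np.allclose(a1, a1p))   # True True
```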


Related research:

- Balancedness and Alignment are Unlikely in Linear Neural Networks (03/13/2020)
- Approximation Results for Gradient Descent trained Neural Networks (09/09/2023)
- Deep Networks Provably Classify Data on Curves (07/29/2021)
- AutoEncoders for Training Compact Deep Learning RF Classifiers for Wireless Protocols (04/13/2019)
- SGD learning on neural networks: leap complexity and saddle-to-saddle dynamics (02/21/2023)
- Deep Networks and the Multiple Manifold Problem (08/25/2020)
