Deep Networks Provably Classify Data on Curves

07/29/2021
by Tingran Wang, et al.

Data with low-dimensional nonlinear structure are ubiquitous in engineering and scientific problems. We study a model problem with such structure: a binary classification task that uses a deep fully-connected neural network to classify data drawn from two disjoint smooth curves on the unit sphere. Aside from mild regularity conditions, we place no restrictions on the configuration of the curves. We prove that when (i) the network depth is large relative to certain geometric properties that set the difficulty of the problem and (ii) the network width and number of samples are polynomial in the depth, randomly-initialized gradient descent quickly learns to correctly classify all points on the two curves with high probability. To our knowledge, this is the first generalization guarantee for deep networks with nonlinear data that depends only on intrinsic data properties. Our analysis proceeds by a reduction to dynamics in the neural tangent kernel (NTK) regime, where the network depth plays the role of a fitting resource in solving the classification problem. In particular, via fine-grained control of the decay properties of the NTK, we demonstrate that when the network is sufficiently deep, the NTK can be locally approximated by a translationally invariant operator on the manifolds and stably inverted over smooth functions, which guarantees convergence and generalization.
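The setting can be illustrated with a toy experiment. The sketch below is a minimal, hedged instance of the problem, not the paper's construction: the two disjoint smooth curves on the unit sphere are taken to be two circles of latitude (the paper allows arbitrary configurations), the network has a single hidden layer rather than the large depth the analysis requires, and only the first-layer weights are trained (a common simplification in NTK-style arguments). It shows a randomly-initialized fully-connected ReLU network trained by plain gradient descent learning to separate the two curves.

```python
import numpy as np

rng = np.random.default_rng(0)

def curve(z, n):
    """Sample n points on the circle of latitude z on the unit sphere S^2."""
    t = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    r = np.sqrt(1.0 - z ** 2)
    return np.stack([r * np.cos(t), r * np.sin(t), np.full(n, z)], axis=1)

# Two disjoint smooth curves on the sphere (an illustrative choice; the
# paper places no restriction on the curves beyond mild regularity).
n = 200
X = np.vstack([curve(0.5, n), curve(-0.5, n)])
y = np.concatenate([np.ones(n), -np.ones(n)])   # +/-1 labels per curve

# One-hidden-layer ReLU network with a wide hidden layer, in the spirit of
# the NTK regime. Second-layer signs are fixed; only W is trained.
width = 512
W = rng.normal(size=(width, 3)) / np.sqrt(3)
a = rng.choice([-1.0, 1.0], size=width) / np.sqrt(width)

def forward(X):
    H = np.maximum(X @ W.T, 0.0)   # hidden ReLU activations
    return H @ a                   # network output f(x)

lr = 1.0
for _ in range(500):
    H = np.maximum(X @ W.T, 0.0)
    f = H @ a
    g = -y / (1.0 + np.exp(y * f))             # d/df of logistic loss log(1+e^{-yf})
    dW = ((g[:, None] * (H > 0)) * a).T @ X / len(y)
    W -= lr * dW                               # full-batch gradient descent step

acc = np.mean(np.sign(forward(X)) == y)
print(f"training accuracy: {acc:.2f}")
```

Because the two latitude circles here happen to be linearly separable, gradient descent drives the training accuracy to (or very near) 100%; the paper's contribution is to prove such convergence and generalization for arbitrary disjoint smooth curves, where depth, not linear separability, supplies the fitting resource.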

Related research

- Deep Networks and the Multiple Manifold Problem (08/25/2020)
- On the non-universality of deep learning: quantifying the cost of symmetry (08/05/2022)
- Approximation Results for Gradient Descent trained Neural Networks (09/09/2023)
- Deep Equals Shallow for ReLU Networks in Kernel Regimes (09/30/2020)
- Learning Curves for Deep Neural Networks: A Gaussian Field Theory Perspective (06/12/2019)
- Any Target Function Exists in a Neighborhood of Any Sufficiently Wide Random Network: A Geometrical Perspective (01/20/2020)
