What Can ResNet Learn Efficiently, Going Beyond Kernels?

05/24/2019
by Zeyuan Allen-Zhu, et al.

How can neural networks such as ResNet efficiently learn CIFAR-10 with test accuracy more than 96%, while other methods, especially kernel methods, fall far behind? Can we provide theoretical justifications for this gap? There is an influential line of work relating neural networks to kernels in the over-parameterized regime, proving that they can learn certain concept classes that are also learnable by kernels, with similar test error. Yet, can we show neural networks provably learn some concept class better than kernels? We answer this positively in the PAC learning language. We prove that neural networks can efficiently learn a notable class of functions, including those defined by three-layer residual networks with smooth activations, without any distributional assumption. At the same time, we prove there are simple functions in this class for which, with the same number of training samples, the test error obtained by neural networks can be much smaller than that of any "generic" kernel method, including neural tangent kernels, conjugate kernels, etc. The main intuition is that multi-layer neural networks can implicitly perform hierarchical learning using different layers, which reduces the sample complexity compared to "one-shot" learning algorithms such as kernel methods.
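
To make the concept class concrete: the functions in question are computed by small residual networks with smooth activations. Below is a minimal, hypothetical PyTorch sketch of such a three-layer residual network; the widths, the tanh activation, and all names are illustrative assumptions, not the paper's exact construction.

```python
# Minimal sketch (illustrative assumptions, not the paper's exact construction):
# a three-layer residual network with a smooth activation, of the kind that
# defines the target concept class discussed in the abstract.
import torch
import torch.nn as nn

class ThreeLayerResNet(nn.Module):
    def __init__(self, d_in: int, d_hidden: int, d_out: int):
        super().__init__()
        self.layer1 = nn.Linear(d_in, d_hidden)
        self.layer2 = nn.Linear(d_hidden, d_hidden)
        self.layer3 = nn.Linear(d_hidden, d_out)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h1 = torch.tanh(self.layer1(x))          # smooth activation
        h2 = h1 + torch.tanh(self.layer2(h1))    # residual (skip) connection
        return self.layer3(h2)

# Example usage: push a random input batch through the network.
model = ThreeLayerResNet(d_in=32, d_hidden=64, d_out=1)
out = model(torch.randn(8, 32))
```

A kernel method, by contrast, fixes its feature map up front and fits all coefficients in one shot; the hierarchical-learning intuition above is that the representation learned by the earlier layer makes the later layer's task easier, which is where the sample-complexity gap comes from.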


Related research

Backward Feature Correction: How Deep Learning Performs Deep Learning (01/13/2020)
How does a 110-layer ResNet learn a high-complexity classifier using rel...

On the Similarity between the Laplace and Neural Tangent Kernels (07/03/2020)
Recent theoretical work has shown that massively overparameterized neura...

Neural Networks can Learn Representations with Gradient Descent (06/30/2022)
Significant theoretical work has established that in specific regimes, n...

Beyond Linearization: On Quadratic and Higher-Order Approximation of Wide Neural Networks (10/03/2019)
Recent theoretical work has established connections between over-paramet...

On Approximation in Deep Convolutional Networks: a Kernel Perspective (02/19/2021)
The success of deep convolutional networks on tasks involving high-di...

Improving Sample Efficiency with Normalized RBF Kernels (07/30/2020)
In deep learning models, learning more with less data is becoming more i...

On the Provable Generalization of Recurrent Neural Networks (09/29/2021)
Recurrent Neural Network (RNN) is a fundamental structure in deep learni...
