Deep frequency principle towards understanding why deeper learning is faster

07/28/2020
by Zhi-Qin John Xu, et al.

Understanding the effect of depth in deep learning is a critical problem. In this work, we use Fourier analysis to empirically propose a promising mechanism for why deeper learning is faster. To this end, we split a deep neural network into two parts: a pre-condition component and a learning component, where the output of the pre-condition component serves as the input of the learning component. Based on experiments with deep networks and real datasets, we propose a deep frequency principle: during training, the effective target function for a deeper hidden layer is biased towards lower frequencies. Therefore, the learning component effectively learns a lower-frequency function when the pre-condition component has more layers. Combined with the well-studied frequency principle, i.e., that deep neural networks learn lower-frequency functions faster, the deep frequency principle offers a reasonable explanation of why deeper learning is faster. We believe these empirical studies will be valuable for future theoretical studies of the effect of depth in deep learning.
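The frequency principle that the abstract builds on (networks fit low-frequency content of the target before high-frequency content) can be illustrated with a minimal numpy sketch. The architecture, 1D target function, and hyperparameters below are illustrative assumptions for demonstration, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1D regression target mixing a low-frequency and a high-frequency mode.
x = np.linspace(-np.pi, np.pi, 256, endpoint=False)[:, None]
y = np.sin(x) + 0.3 * np.sin(8 * x)

# Tiny one-hidden-layer tanh network trained by full-batch gradient descent.
n_hidden = 100
W1 = rng.normal(0.0, 1.0, (1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.05, (n_hidden, 1))
b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

def freq_energy(res, k):
    """Energy of the residual at discrete frequency index k (cycles per window)."""
    return np.abs(np.fft.rfft(res[:, 0])[k]) ** 2

_, pred = forward(x)
res0 = pred - y
low0, high0 = freq_energy(res0, 1), freq_energy(res0, 8)  # sin(x) -> k=1, sin(8x) -> k=8

lr = 0.01
for _ in range(10000):
    h, pred = forward(x)
    err = (pred - y) / len(x)          # gradient of 0.5 * mean squared error
    gW2 = h.T @ err
    gb2 = err.sum(0)
    gh = err @ W2.T * (1 - h ** 2)     # backprop through tanh
    gW1 = x.T @ gh
    gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(x)
res = pred - y
low, high = freq_energy(res, 1), freq_energy(res, 8)
```

After training, the residual energy at the low frequency (k=1) drops markedly from its initial value, while the high-frequency mode is fitted more slowly, which is the qualitative behavior the deep frequency principle leverages.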


Related research

01/04/2021
Frequency Principle in Deep Learning Beyond Gradient-descent-based Training
Frequency perspective recently makes progress in understanding deep lear...

01/19/2022
Overview frequency principle/spectral bias in deep learning
Understanding deep learning is increasingly emergent as it penetrates mo...

12/20/2014
Why does Deep Learning work? - A perspective from Group Theory
Why does Deep Learning work? What representations does it capture? How d...

04/08/2015
A Group Theoretic Perspective on Unsupervised Deep Learning
Why does Deep Learning work? What representations does it capture? How d...

05/26/2022
Embedding Principle in Depth for the Loss Landscape Analysis of Deep Neural Networks
Unraveling the general structure underlying the loss landscapes of deep ...

12/06/2020
Fourier-domain Variational Formulation and Its Well-posedness for Supervised Learning
A supervised learning problem is to find a function in a hypothesis func...

10/04/2018
The Dynamics of Differential Learning I: Information-Dynamics and Task Reachability
We study the topology of the space of learning tasks, which is critical ...
