On the Learning Dynamics of Two-layer Nonlinear Convolutional Neural Networks

05/24/2019
by Bing Yu, et al.

Convolutional neural networks (CNNs) have achieved remarkable performance in various fields, particularly in computer vision. However, why this architecture works so well remains a mystery. In this work we take a small step toward understanding the success of CNNs by investigating the learning dynamics of a two-layer nonlinear convolutional neural network over specific data distributions. Rather than the typical Gaussian assumption on the input distribution, we consider a more realistic setting in which each data point (e.g., an image) contains a specific pattern that determines its class label. Within this setting, we show both theoretically and empirically that some convolutional filters learn the key patterns in the data and that the norms of these filters come to dominate during training with stochastic gradient descent. Moreover, with arbitrarily high probability, once the number of iterations is sufficiently large, the CNN model obtains 100% accuracy over the considered data distributions. Our experiments demonstrate that these findings still hold, to some extent, for practical image classification tasks.
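To make the setting concrete, the sketch below is a minimal, hypothetical rendering of the setup described above: synthetic patch-based inputs in which a single planted pattern determines the class label, a small two-layer nonlinear CNN trained with SGD, and a probe that compares the norm of pattern-aligned filters against the remaining filters. The data construction, architecture details, and hyperparameters here are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: planted-pattern data, two-layer nonlinear CNN, SGD,
# and tracking of filter norms. Illustrative only; not the paper's exact model.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

d, P, n = 10, 8, 512                          # patch size, patches per sample, sample count
key = F.normalize(torch.randn(2, d), dim=1)   # one key pattern per class (assumed construction)

def make_data(n):
    """Each sample is P noise patches; one random patch is replaced by the
    key pattern of its class, so that pattern determines the label."""
    y = torch.randint(0, 2, (n,))
    x = 0.1 * torch.randn(n, P, d)
    pos = torch.randint(0, P, (n,))
    x[torch.arange(n), pos] = key[y]
    return x, y

class TwoLayerCNN(torch.nn.Module):
    """Conv filters applied to every patch, ReLU, average pooling over
    patches, then a linear output layer."""
    def __init__(self, k=16):
        super().__init__()
        self.w = torch.nn.Parameter(0.01 * torch.randn(k, d))  # convolutional filters
        self.a = torch.nn.Parameter(0.01 * torch.randn(k))     # second-layer weights

    def forward(self, x):                      # x: (n, P, d)
        h = F.relu(x @ self.w.T)               # filter responses per patch
        return h.mean(dim=1) @ self.a          # pool over patches, then score

x, y = make_data(n)
model = TwoLayerCNN()
opt = torch.optim.SGD(model.parameters(), lr=0.5)

for step in range(2000):
    opt.zero_grad()
    loss = F.binary_cross_entropy_with_logits(model(x), y.float())
    loss.backward()
    opt.step()
    if step % 500 == 0:
        # Filters whose direction is close to a key pattern vs. the rest.
        align = (F.normalize(model.w, dim=1) @ key.T).abs().max(dim=1).values
        aligned = align > 0.9
        print(f"step {step:4d}  loss {loss.item():.3f}  "
              f"|w| aligned {model.w[aligned].norm():.3f}  "
              f"|w| other {model.w[~aligned].norm():.3f}")
```

In runs of this toy setup one would expect the norm of the pattern-aligned filters to grow and eventually dominate the remaining filters, which is the qualitative behaviour the abstract describes.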

Related research

01/05/2018  Learning 3D-FilterMap for Deep Convolutional Neural Networks
We present a novel and compact architecture for deep Convolutional Neura...

02/14/2022  Analytic Learning of Convolutional Neural Network For Pattern Recognition
Training convolutional neural networks (CNNs) with back-propagation (BP)...

07/15/2020  Focus-and-Expand: Training Guidance Through Gradual Manipulation of Input Features
We present a simple and intuitive Focus-and-eXpand () method to guide th...

08/26/2018  DeepTracker: Visualizing the Training Process of Convolutional Neural Networks
Deep convolutional neural networks (CNNs) have achieved remarkable succe...

06/02/2022  Understanding the Role of Nonlinearity in Training Dynamics of Contrastive Learning
While the empirical success of self-supervised learning (SSL) heavily re...

05/26/2015  Accelerating Very Deep Convolutional Networks for Classification and Detection
This paper aims to accelerate the test-time computation of convolutional...

05/21/2021  Distinguishing artefacts: evaluating the saturation point of convolutional neural networks
Prior work has shown Convolutional Neural Networks (CNNs) trained on sur...
