
Learning and Generalization in Overparameterized Normalizing Flows

06/19/2021
by Kulin Shah, et al.

In supervised learning, it is known that overparameterized neural networks with one hidden layer provably and efficiently learn and generalize when trained by stochastic gradient descent with a sufficiently small learning rate and suitable initialization. In contrast, the benefit of overparameterization in unsupervised learning is not well understood. Normalizing flows (NFs) constitute an important class of models in unsupervised learning for sampling and density estimation. In this paper, we theoretically and empirically analyze these models when the underlying neural network is a one-hidden-layer overparameterized network. Our main contributions are twofold: (1) on the one hand, we provide theoretical and empirical evidence that, for a class of NFs containing most existing NF models, overparameterization hurts training; (2) on the other hand, we prove that unconstrained NFs, a recently introduced model, can efficiently learn any reasonable data distribution under minimal assumptions when the underlying network is overparameterized.
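To make the unconstrained-NF setting concrete, here is a minimal sketch of one common instantiation: a 1-D flow made monotone by integrating the positive output of an unconstrained one-hidden-layer ReLU network (in the style of unconstrained monotonic neural networks). This is an illustrative assumption, not the paper's exact construction; the names g and f, the width m, and the trapezoidal integration are all made up for the example.

```python
import numpy as np

# Hypothetical sketch (assumed instantiation, not the paper's exact model):
# the flow transform f is made monotone by integrating the positive output
# of an unconstrained one-hidden-layer ReLU network g:  f(x) = ∫_0^x g(t) dt.
rng = np.random.default_rng(0)
m = 512                              # hidden width; large m = overparameterized
W = rng.normal(size=m)               # input-to-hidden weights
b = rng.normal(size=m)               # hidden biases
a = rng.normal(size=m) / np.sqrt(m)  # hidden-to-output weights

def g(x):
    """Positive 'derivative network': softplus of a one-hidden-layer ReLU net."""
    h = np.maximum(W * x[:, None] + b, 0.0)  # (len(x), m) ReLU features
    return np.log1p(np.exp(h @ a))           # softplus keeps f' > 0 everywhere

def f(x, n=200):
    """f(x) = integral of g from 0 to x via the trapezoidal rule, so f is monotone."""
    ts = np.linspace(0.0, 1.0, n)[:, None] * x[None, :]  # integration path 0 -> x_i
    gs = g(ts.ravel()).reshape(n, -1)
    return ((gs[1:] + gs[:-1]) / 2 * (ts[1:] - ts[:-1])).sum(axis=0)

def log_density(x):
    """Change of variables: log p(x) = log N(f(x); 0, 1) + log f'(x)."""
    z = f(x)
    return -0.5 * (z**2 + np.log(2.0 * np.pi)) + np.log(g(x))

print(log_density(np.array([-1.0, 0.0, 2.0])))
```

Training such a model would maximize the sum of log_density over the data with respect to (W, b, a); because g is an unconstrained network, invertibility comes for free from the integration, rather than from weight constraints as in most standard NF models.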
