Generative Models as Distributions of Functions

02/09/2021
by Emilien Dupont, et al.

Generative models are typically trained on grid-like data such as images. As a result, the size of these models usually scales directly with the underlying grid resolution. In this paper, we abandon discretized grids and instead parameterize individual data points by continuous functions. We then build generative models by learning distributions over such functions. By treating data points as functions, we can abstract away from the specific type of data we train on and construct models that scale independently of signal resolution and dimension. To train our model, we use an adversarial approach with a discriminator that acts directly on continuous signals. Through experiments on both images and 3D shapes, we demonstrate that our model can learn rich distributions of functions independently of data type and resolution.
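
To make the setup concrete, below is a minimal PyTorch sketch of the two pieces the abstract describes: a generator that maps a latent code and continuous coordinates to features (so grid resolution never enters the model), and a discriminator that scores raw (coordinate, feature) pairs. The specific architecture choices here (a ReLU MLP, mean pooling, the non-saturating GAN loss, and all layer sizes) are illustrative assumptions, not necessarily the authors' exact model.

```python
# A minimal sketch of the idea in the abstract: represent each data point as a
# function from coordinates to features, generate such functions from a latent
# code, and train adversarially with a discriminator that acts directly on
# (coordinate, feature) pairs. Architecture details are illustrative only.
import torch
import torch.nn as nn

class FunctionGenerator(nn.Module):
    """Maps a latent code z and continuous coordinates x to features,
    e.g. (x, y) -> (r, g, b) for images. Resolution never appears in the
    model: any set of coordinates can be queried."""
    def __init__(self, latent_dim=64, coord_dim=2, feature_dim=3, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + coord_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, feature_dim),
        )

    def forward(self, z, coords):
        # z: (batch, latent_dim); coords: (batch, num_points, coord_dim)
        z = z.unsqueeze(1).expand(-1, coords.shape[1], -1)
        return self.net(torch.cat([z, coords], dim=-1))

class SetDiscriminator(nn.Module):
    """Scores a set of (coordinate, feature) pairs directly, so it is
    agnostic to resolution. A permutation-invariant mean pool stands in
    here for a more expressive point-set discriminator."""
    def __init__(self, coord_dim=2, feature_dim=3, hidden=128):
        super().__init__()
        self.point_net = nn.Sequential(
            nn.Linear(coord_dim + feature_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, 1)

    def forward(self, coords, features):
        h = self.point_net(torch.cat([coords, features], dim=-1))
        return self.head(h.mean(dim=1))  # pool over points, then score

# One adversarial step on a batch of "images" sampled at random coordinates.
G, D = FunctionGenerator(), SetDiscriminator()
coords = torch.rand(8, 256, 2)              # 256 random pixel locations
real_feats = torch.rand(8, 256, 3)          # their RGB values from real data
fake_feats = G(torch.randn(8, 64), coords)  # generated function, same coords
d_loss = (nn.functional.softplus(-D(coords, real_feats)) +
          nn.functional.softplus(D(coords, fake_feats.detach()))).mean()
g_loss = nn.functional.softplus(-D(coords, fake_feats)).mean()
```

Because both networks consume arbitrary coordinate sets rather than fixed grids, the same sketch applies unchanged to other signal types, e.g. 3D shapes with `coord_dim=3` and occupancy as the feature.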

Related research

Flexible Prior Distributions for Deep Generative Models (10/31/2017)
We consider the problem of training generative models with deep neural n...

Generator Reversal (07/28/2017)
We consider the problem of training generative models with deep neural n...

Curriculum Learning for Deep Generative Models with Clustering (06/27/2019)
Training generative models like generative adversarial networks (GANs) a...

From data to functa: Your data point is a function and you should treat it like one (01/28/2022)
It is common practice in deep learning to represent a measurement of the...

Intrinsic Multi-scale Evaluation of Generative Models (05/27/2019)
Generative models are often used to sample high-dimensional data points ...

Interpreting Spatially Infinite Generative Models (07/24/2020)
Traditional deep generative models of images and other spatial modalitie...

Learning Implicit Generative Models by Matching Perceptual Features (04/04/2019)
Perceptual features (PFs) have been used with great success in tasks suc...