Conditional Sampling With Monotone GANs

06/11/2020
by Nikola Kovachki, et al.

We present a new approach for sampling conditional measures that enables uncertainty quantification in supervised learning tasks. We construct a mapping that transforms a reference measure to the probability measure of the output conditioned on new inputs. The mapping is trained via a modification of generative adversarial networks (GANs), called monotone GANs, that imposes monotonicity constraints and a block triangular structure. We present theoretical results in an idealized setting that support our proposed method, as well as numerical experiments demonstrating the ability of our method to sample the correct conditional measures in applications ranging from inverse problems to image in-painting.
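The abstract describes a block-triangular transport map, trained adversarially under a monotonicity constraint, that pushes a reference measure forward to the distribution of the output conditioned on an input. The sketch below is not the authors' implementation; it only illustrates, in PyTorch, one way such a setup could look: a generator whose x-block is the identity and whose z-block is learned, a critic on joint (x, y) samples, and a penalty that discourages violations of monotonicity in z. The network widths, the WGAN-style loss, and the penalty weight lam are assumptions made for illustration.

import torch
import torch.nn as nn

class BlockTriangularGenerator(nn.Module):
    """T(x, z) = (x, T2(x, z)): x passes through unchanged and only the
    second block, acting on the reference sample z, is learned."""
    def __init__(self, x_dim, z_dim, hidden=128):
        super().__init__()
        self.t2 = nn.Sequential(
            nn.Linear(x_dim + z_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, z_dim),  # output dim matches z so the monotonicity pairing is defined
        )

    def forward(self, x, z):
        return self.t2(torch.cat([x, z], dim=-1))

class Discriminator(nn.Module):
    """Critic on joint samples (x, y)."""
    def __init__(self, x_dim, y_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=-1))

def monotonicity_penalty(gen, x, z1, z2):
    """Penalize violations of <T2(x, z1) - T2(x, z2), z1 - z2> >= 0,
    i.e. encourage monotonicity of z -> T2(x, z) for each fixed x."""
    inner = ((gen(x, z1) - gen(x, z2)) * (z1 - z2)).sum(dim=-1)
    return torch.relu(-inner).mean()

def generator_step(gen, disc, opt_g, x, y, lam=1.0):
    """One generator update on a batch of input-output pairs (x, y);
    y is only used here to read off the output dimension."""
    z1 = torch.randn(x.size(0), y.size(1))  # reference (Gaussian) samples
    z2 = torch.randn(x.size(0), y.size(1))
    y_fake = gen(x, z1)
    adv = -disc(x, y_fake).mean()           # WGAN-style generator term (an assumption)
    loss = adv + lam * monotonicity_penalty(gen, x, z1, z2)
    opt_g.zero_grad()
    loss.backward()
    opt_g.step()
    return loss.item()

After training, one would draw fresh z ~ N(0, I) and evaluate the generator at (x_new, z) to obtain approximate samples from the conditional measure of the output given x_new, which is the uncertainty-quantification use case described in the abstract.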

Related research

Partially Conditioned Generative Adversarial Networks (07/06/2020)
Generative models are undoubtedly a hot topic in Artificial Intelligence...

Invertible Conditional GANs for image editing (11/19/2016)
Generative Adversarial Networks (GANs) have recently demonstrated to suc...

Bayesian Conditional Generative Adverserial Networks (06/17/2017)
Traditional GANs use a deterministic generator function (typically a neu...

Information Compensation for Deep Conditional Generative Networks (01/23/2020)
In recent years, unsupervised/weakly-supervised conditional generative a...

X-GANs: Image Reconstruction Made Easy for Extreme Cases (08/06/2018)
Image reconstruction including image restoration and denoising is a chal...

Double Generative Adversarial Networks for Conditional Independence Testing (06/03/2020)
In this article, we consider the problem of high-dimensional conditional...

Supervised learning with probabilistic morphisms and kernel mean embeddings (05/10/2023)
In this paper I propose a concept of a correct loss function in a genera...