Copula & Marginal Flows: Disentangling the Marginal from its Joint

07/07/2019
by Magnus Wiese, et al.

Deep generative networks such as GANs and normalizing flows flourish in high-dimensional tasks such as image generation. However, exact modeling or extrapolation of distributional properties, such as the tail asymptotics generated by a generative network, has so far not been available. In this paper, we address this issue for the first time in the deep learning literature by making two novel contributions. First, we derive upper bounds on the tails that a generative network can express and establish related Lp-space properties; in particular, we show that in various situations an optimal generative network does not exist. Second, we introduce copula and marginal generative flows (CM flows), which allow exact modeling of the tail and of any prior assumption on the CDF, up to an approximation of the uniform distribution. Our numerical results support the use of CM flows.
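
The copula/marginal disentanglement the abstract refers to rests on Sklar's theorem: a joint sample can be produced by first drawing a point on [0, 1]^d from a copula model and then pushing each coordinate through the inverse CDF of its marginal, so the tails are dictated entirely by the chosen marginals. The sketch below is not the authors' implementation: it uses a Gaussian copula as a stand-in for the learned copula flow and Student-t quantile functions as illustrative heavy-tailed marginals; all function names and parameter values are assumptions for illustration only.

```python
# Minimal sketch of the copula + marginal factorization behind CM flows.
# Step 1: sample dependence structure on [0, 1]^d from a copula model
#         (here a Gaussian copula stands in for the learned copula flow).
# Step 2: apply inverse marginal CDFs, which fixes the tail behavior exactly
#         (here Student-t marginals, chosen purely for illustration).
import numpy as np
from scipy.stats import norm, t

def sample_copula_gaussian(n, corr, rng):
    """Stand-in for the copula model: Gaussian copula samples on [0, 1]^d."""
    z = rng.multivariate_normal(mean=np.zeros(corr.shape[0]), cov=corr, size=n)
    return norm.cdf(z)  # componentwise probability integral transform

def push_through_marginals(u, dof):
    """Inverse marginal CDFs; Student-t tails are prescribed by construction."""
    return np.column_stack([t.ppf(u[:, j], df=dof[j]) for j in range(u.shape[1])])

rng = np.random.default_rng(0)
corr = np.array([[1.0, 0.7], [0.7, 1.0]])       # dependence (copula) parameters
u = sample_copula_gaussian(10_000, corr, rng)    # samples on the unit square
x = push_through_marginals(u, dof=[3.0, 5.0])    # heavy-tailed joint samples
```

In a CM-flow-style setup, the Gaussian copula above would be replaced by a normalizing flow trained to approximate the copula on the unit cube, while the marginal CDFs encode the prior tail assumptions exactly rather than being learned.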


Related research

Marginal Tail-Adaptive Normalizing Flows (06/21/2022)
Learning the tail behavior of a distribution is a notoriously difficult ...

COMET Flows: Towards Generative Modeling of Multivariate Extremes and Tail Dependence (05/02/2022)
Normalizing flows, a popular class of deep generative models, often fail...

Towards prediction of turbulent flows at high Reynolds numbers using high performance computing data and deep learning (10/28/2022)
In this paper, deep learning (DL) methods are evaluated in the context o...

Generative Sliced MMD Flows with Riesz Kernels (05/19/2023)
Maximum mean discrepancy (MMD) flows suffer from high computational cost...

SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows (07/06/2020)
Normalizing flows and variational autoencoders are powerful generative m...

Fat-Tailed Variational Inference with Anisotropic Tail Adaptive Flows (05/16/2022)
While fat-tailed densities commonly arise as posterior and marginal dist...

Flows Succeed Where GANs Fail: Lessons from Low-Dimensional Data (06/17/2020)
Normalizing flows and generative adversarial networks (GANs) are both ap...
