
Copula & Marginal Flows: Disentangling the Marginal from its Joint

by Magnus Wiese, et al.
Technische Universität Kaiserslautern

Deep generative networks such as GANs and normalizing flows flourish in high-dimensional tasks such as image generation. However, exact modeling or extrapolation of distributional properties, such as the tail asymptotics of the distribution generated by a network, has so far not been available. In this paper, we address this issue for the first time in the deep learning literature by making two novel contributions. First, we derive upper bounds for the tails that can be expressed by a generative network and demonstrate related Lp-space properties; in particular, we show that in various situations an optimal generative network does not exist. Second, we introduce copula and marginal generative flows (CM flows), which allow for exact modeling of the tails and of any prior assumption on the CDF, up to an approximation of the uniform distribution. Our numerical results support the use of CM flows.
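The copula/marginal split the abstract describes rests on Sklar's theorem: any joint distribution factors into a copula (the dependence on the unit cube) and its one-dimensional marginals, so the tails can be imposed exactly through the marginal quantile functions. Below is a minimal sketch of that idea with SciPy, not the authors' CM flow implementation: the known Gaussian CDF stands in for the learned marginal map, and Student-t quantiles stand in for a prior tail assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy joint sample: a correlated Gaussian pair (stand-in for data whose
# dependence structure we want to keep).
cov = np.array([[1.0, 0.7], [0.7, 1.0]])
x = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=5000)

# Copula step: push each marginal to (0, 1) via its probability integral
# transform.  A CM flow learns this map; here the marginals are known to
# be standard normal, so the Gaussian CDF suffices.
u = stats.norm.cdf(x)

# Marginal step: impose exact heavy tails by applying the quantile
# function (inverse CDF) of a chosen marginal, e.g. Student-t with 3 dof.
y = stats.t.ppf(u, df=3)

# The dependence (copula) is preserved while the tail asymptotics are now
# exactly those of the Student-t marginals.
print(np.corrcoef(y[:, 0], y[:, 1])[0, 1])
```

Because the quantile function is applied in closed form, the tail behavior of `y` is exact by construction rather than approximated by the network, which is the point of disentangling the marginal from the joint.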



