Subadditivity of Probability Divergences on Bayes-Nets with Applications to Time Series GANs

03/02/2020
by Mucong Ding, et al.

GANs for time series data often use sliding windows or self-attention to capture underlying time dependencies. While these techniques have no clear theoretical justification, they are successful in significantly reducing the discriminator size, speeding up the training process, and improving the generation quality. In this paper, we provide both theoretical foundations and a practical framework for GANs on high-dimensional distributions whose conditional independence structure is captured by a Bayesian network, such as time series data. We prove that several probability divergences satisfy subadditivity properties with respect to the neighborhoods of the Bayes-net graph: the distance between two Bayes-nets is upper-bounded by the sum of the (local) distances between their marginals on every neighborhood of the graph. This leads to our proposed Subadditive GAN framework, which uses a set of simple discriminators on the neighborhoods of the Bayes-net rather than a giant discriminator on the entire network, providing significant statistical and computational benefits. We show that several probability distances, including Jensen-Shannon, Total Variation, and Wasserstein, satisfy subadditivity or generalized subadditivity. Moreover, we prove that Integral Probability Metrics (IPMs), which encompass commonly-used loss functions in GANs, also enjoy a notion of subadditivity under mild conditions. Furthermore, we prove that nearly all f-divergences satisfy local subadditivity, in which the bound holds when the two distributions are relatively close. Our experiments on synthetic as well as real-world datasets verify the proposed theory and the benefits of subadditive GANs.
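The subadditivity bound above can be checked numerically on a toy Bayes-net. The sketch below (an illustration, not the paper's implementation) builds two 3-node binary Markov chains X1 → X2 → X3 with different transition probabilities, and verifies that the Total Variation distance between the joint distributions is bounded by the sum of the TV distances between the marginals on the chain's neighborhoods {X1, X2} and {X2, X3}:

```python
from itertools import product

def markov_joint(stay):
    """Joint distribution of a binary Markov chain X1 -> X2 -> X3
    with uniform start and symmetric transition probability `stay`."""
    dist = {}
    for x1, x2, x3 in product([0, 1], repeat=3):
        p = 0.5
        p *= stay if x2 == x1 else 1 - stay
        p *= stay if x3 == x2 else 1 - stay
        dist[(x1, x2, x3)] = p
    return dist

def tv(p, q):
    """Total Variation distance between two finite distributions."""
    return 0.5 * sum(abs(p[k] - q[k]) for k in p)

def marginal(dist, idx):
    """Marginal of `dist` on the coordinates listed in `idx`."""
    out = {}
    for x, p in dist.items():
        key = tuple(x[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

P = markov_joint(0.9)  # chain that keeps its state w.p. 0.9
Q = markov_joint(0.7)  # chain that keeps its state w.p. 0.7

tv_joint = tv(P, Q)
# Neighborhoods of the chain graph: {X1, X2} and {X2, X3}.
tv_local = sum(tv(marginal(P, n), marginal(Q, n)) for n in [(0, 1), (1, 2)])

print(tv_joint, tv_local)  # here 0.32 <= 0.40, as subadditivity predicts
assert tv_joint <= tv_local
```

In the Subadditive GAN framework this bound is what licenses replacing one discriminator on the full joint with one small discriminator per neighborhood: driving every local distance to zero drives the global distance to zero.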

