Sample Complexity of Probability Divergences under Group Symmetry

02/03/2023
by Ziyu Chen, et al.

We rigorously quantify the improvement in the sample complexity of variational divergence estimation for group-invariant distributions. For the Wasserstein-1 metric and the Lipschitz-regularized α-divergences, the reduction in sample complexity is proportional to an ambient-dimension-dependent power of the group size. For the maximum mean discrepancy (MMD), the improvement is more nuanced, as it depends not only on the group size but also on the choice of kernel. Numerical simulations corroborate our theoretical results.
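The MMD effect described in the abstract can be illustrated with a small experiment. The sketch below (an illustration, not the paper's method; the group, kernel, bandwidth, and sample sizes are all assumptions) estimates the biased squared MMD between two independent samples from a rotation-invariant 2-D Gaussian, with and without augmenting each sample by its orbit under the cyclic group C4 of 90° rotations. Since the Gaussian kernel is rotation-invariant, orbit averaging can only shrink the MMD estimate, so the symmetrized estimator of MMD(P, P) = 0 has smaller error.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between sample sets X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    # Biased estimator of squared MMD; nonnegative by construction.
    return (gaussian_kernel(X, X, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean()
            - 2 * gaussian_kernel(X, Y, sigma).mean())

def c4_orbit(X):
    # Augment each 2-D sample with its orbit under 90-degree rotations (C4).
    orbits = []
    for k in range(4):
        th = np.pi / 2 * k
        R = np.array([[np.cos(th), -np.sin(th)],
                      [np.sin(th),  np.cos(th)]])
        orbits.append(X @ R.T)
    return np.concatenate(orbits, axis=0)

rng = np.random.default_rng(0)
n = 200
# Rotation-invariant target: standard 2-D Gaussian; X, Y are i.i.d. draws.
X = rng.standard_normal((n, 2))
Y = rng.standard_normal((n, 2))

plain = mmd2(X, Y)           # estimate of MMD^2(P, P) = 0 from raw samples
sym = mmd2(c4_orbit(X), c4_orbit(Y))  # same estimate after symmetrization
# For a rotation-invariant kernel, sym <= plain always holds.
```

The inequality `sym <= plain` follows because rotations act as isometries on the RKHS of a rotation-invariant kernel, so orbit averaging contracts the embedding gap; the gain with larger groups and other kernels is exactly what the paper quantifies.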


Related research

- 03/24/2023: The Exact Sample Complexity Gain from Invariances for Kernel Regression on Manifolds
- 05/22/2023: Statistical Guarantees of Group-Invariant GANs
- 10/05/2018: Sample Complexity of Sinkhorn divergences
- 06/14/2021: On the Sample Complexity of Learning with Geometric Stability
- 03/07/2023: Group conditional validity via multi-group learning
- 01/17/2018: An Empirical Analysis of Proximal Policy Optimization with Kronecker-factored Natural Gradients
- 08/14/2020: On the Sample Complexity of Super-Resolution Radar
