Importance weighted generative networks

06/07/2018
by Maurice Diesendruck, et al.

Deep generative networks can simulate from a complex target distribution by minimizing a loss with respect to samples from that distribution. Often, however, we do not have direct access to our target distribution: our data may be subject to sample selection bias, or may come from a different but related distribution. We present methods based on importance weighting that can estimate the loss with respect to a target distribution even when that distribution cannot be sampled directly, in a variety of settings. These estimators, which differentially weight the contribution of each data point to the loss function, offer both theoretical guarantees and impressive empirical performance.
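The core idea, estimating an expected loss under a target distribution by reweighting samples drawn from a different source distribution, can be sketched with a generic self-normalized importance-sampling estimator. This is a minimal illustration of the general technique, not the paper's specific estimators; the function name and toy Gaussian densities are assumptions for the example.

```python
import numpy as np

def importance_weighted_loss(losses, log_p_target, log_p_source):
    """Self-normalized importance-weighted estimate of the expected loss
    under the target distribution, using samples from the source.

    Weights satisfy w_i ∝ p_target(x_i) / p_source(x_i); because the
    weights are normalized to sum to one, any normalization constants
    of the two densities cancel, so unnormalized log-densities suffice.
    """
    log_w = log_p_target - log_p_source
    log_w = log_w - log_w.max()          # subtract max for numerical stability
    w = np.exp(log_w)
    w = w / w.sum()                      # self-normalize the weights
    return float(np.sum(w * losses))

# Toy check: samples from a source N(0, 1), reweighted toward target N(1, 1).
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=200_000)
losses = (x - 1.0) ** 2                  # per-sample loss
log_p_source = -0.5 * x ** 2             # unnormalized log-density of N(0, 1)
log_p_target = -0.5 * (x - 1.0) ** 2     # unnormalized log-density of N(1, 1)
est = importance_weighted_loss(losses, log_p_target, log_p_source)
# Under the target, E[(X - 1)^2] is the variance, i.e. 1, so est should be near 1.0.
```

In a generative-model training loop, `losses` would be per-sample discrepancy terms and the weights would correct for the mismatch between the observed data distribution and the desired target.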


Related research

- Effects of sampling skewness of the importance-weighted risk estimator on model selection (04/19/2018): Importance-weighting is a popular and well-researched technique for deal...
- Generative Autotransporters (06/08/2017): In this paper, we aim to introduce the classic Optimal Transport theory ...
- Bias Correction of Learned Generative Models using Likelihood-Free Importance Weighting (06/23/2019): A learned generative model often produces biased statistics relative to ...
- Mandoline: Model Evaluation under Distribution Shift (07/01/2021): Machine learning models are often deployed in different settings than th...
- Convergence rates for optimised adaptive importance samplers (03/28/2019): Adaptive importance samplers are adaptive Monte Carlo algorithms to esti...
- Rethinking Importance Weighting for Deep Learning under Distribution Shift (06/08/2020): Under distribution shift (DS) where the training data distribution diffe...
- Correcting Underrepresentation and Intersectional Bias for Fair Classification (06/19/2023): We consider the problem of learning from data corrupted by underrepresen...
