Kernel Stein Generative Modeling

07/06/2020
by   Wei-Cheng Chang, et al.

We are interested in gradient-based Explicit Generative Modeling, where samples are derived from iterative gradient updates based on an estimate of the score function of the data distribution. Recent advances in Stochastic Gradient Langevin Dynamics (SGLD) demonstrate impressive results with energy-based models on high-dimensional and complex data distributions. Stein Variational Gradient Descent (SVGD) is a deterministic sampling algorithm that iteratively transports a set of particles to approximate a given distribution, using functional gradient descent to decrease the KL divergence. SVGD has shown promising results in several Bayesian inference applications, but its use in high-dimensional problems remains under-explored. The goal of this work is to study high-dimensional inference with SVGD. We first identify key challenges in practical kernel SVGD inference in high dimensions. We then propose noise conditional kernel SVGD (NCK-SVGD), which works in tandem with the recently introduced Noise Conditional Score Network estimator. NCK is crucial for successful SVGD inference in high dimensions, as it adapts the kernel to the noise level of the score estimate. As we anneal the noise, NCK-SVGD targets the real data distribution. We further extend the annealed SVGD with an entropic regularization, which offers flexible control over the trade-off between sample quality and diversity; we verify this empirically with precision and recall evaluations. NCK-SVGD produces samples comparable to GANs and annealed SGLD on computer vision benchmarks, including MNIST and CIFAR-10.
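To make the particle-transport idea concrete, here is a minimal NumPy sketch of one SVGD update with an RBF kernel: phi(x_i) = (1/n) Σ_j [k(x_j, x_i) ∇ log p(x_j) + ∇_{x_j} k(x_j, x_i)], where the first term pulls particles toward high-density regions and the second repels them from each other. Function names and the fixed bandwidth `h` are illustrative (practical SVGD usually sets `h` via the median heuristic), and the score function would come from an estimator such as a Noise Conditional Score Network rather than a closed form.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j."""
    diffs = X[:, None, :] - X[None, :, :]        # (n, n, d): x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)       # (n, n) pairwise squared distances
    K = np.exp(-sq_dists / (2 * h ** 2))
    grad_K = -(diffs / h ** 2) * K[:, :, None]   # (n, n, d): grad_{x_j} k(x_j, x_i)
    return K, grad_K

def svgd_step(X, score, step=0.1, h=1.0):
    """One SVGD update: particles move along the kernelized Stein direction."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    S = score(X)                                 # (n, d): estimated grad log p at each particle
    # Attractive term K @ S plus repulsive term sum_j grad_{x_j} k(x_j, x_i)
    phi = (K @ S + grad_K.sum(axis=0)) / n
    return X + step * phi
```

For a standard Gaussian target, `score = lambda X: -X`; iterating `svgd_step` moves an off-center particle cloud toward the target while the repulsive kernel term keeps the particles spread out.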


