Statistical Guarantees for Transformation Based Models with Applications to Implicit Variational Inference

10/23/2020
by Sean Plummer, et al.

Transformation-based methods have been an attractive approach in non-parametric inference for problems such as unconditional and conditional density estimation, owing to their hierarchical structure that models the data as a flexible transformation of a set of common latent variables. More recently, transformation-based models have been used in variational inference (VI) to construct flexible implicit families of variational distributions. However, their use in both non-parametric inference and variational inference lacks theoretical justification. We provide theoretical justification for the use of non-linear latent variable models (NL-LVMs) in non-parametric inference by showing that the support of the transformation-induced prior in the space of densities is sufficiently large in the L_1 sense. We also show that, when a Gaussian process (GP) prior is placed on the transformation function, the posterior concentrates at the optimal rate up to a logarithmic factor. Adopting the flexibility demonstrated in the non-parametric setting, we use the NL-LVM to construct an implicit family of variational distributions, termed GP-IVI. We delineate sufficient conditions under which GP-IVI achieves optimal risk bounds and approximates the true posterior in the sense of the Kullback-Leibler divergence. To the best of our knowledge, this is the first work to provide theoretical guarantees for implicit variational inference.
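To make the NL-LVM construction in the abstract concrete, here is a minimal generative sketch. The specifics are assumptions for illustration only: latent variables drawn uniformly on [0, 1], a squared-exponential GP prior on the transformation function, and Gaussian observation noise; the kernel, length scale, and noise level are not taken from the paper.

```python
# Sketch of an NL-LVM draw: y_i = mu(eta_i) + eps_i, with mu ~ GP prior,
# eta_i ~ Unif(0, 1) common latent variables, eps_i ~ N(0, sigma^2).
# Kernel choice and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def se_kernel(s, t, length_scale=0.2, variance=1.0):
    """Squared-exponential covariance k(s, t)."""
    d = s[:, None] - t[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

# Draw the transformation mu ~ GP(0, k) on a fine grid, then interpolate.
grid = np.linspace(0.0, 1.0, 200)
K = se_kernel(grid, grid) + 1e-8 * np.eye(grid.size)  # jitter for stability
mu_on_grid = rng.multivariate_normal(np.zeros(grid.size), K)

# Data are a flexible transformation of common latent variables plus noise.
n, sigma = 500, 0.1
eta = rng.uniform(0.0, 1.0, size=n)
y = np.interp(eta, grid, mu_on_grid) + sigma * rng.normal(size=n)

# The marginal density of y is the pushforward of the uniform latent law
# through mu, convolved with a Gaussian kernel; the paper's results concern
# the L_1 support and posterior contraction rate of this induced density.
```

The same transformation-of-latents structure, with the GP-drawn map acting as a flexible pushforward, is what the paper reuses to build the implicit variational family GP-IVI.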


Related research

Stein Variational Gaussian Processes (09/25/2020)
We show how to use Stein variational gradient descent (SVGD) to carry ou...

Implicit Variational Inference with Kernel Density Ratio Fitting (05/29/2017)
Recent progress in variational inference has paid much attention to the ...

Blindness of score-based methods to isolated components and mixing proportions (08/23/2020)
A large family of score-based methods are developed recently to solve un...

Hierarchical Implicit Models and Likelihood-Free Variational Inference (02/28/2017)
Implicit probabilistic models are a flexible class of models defined by ...

Deep Latent-Variable Kernel Learning (05/18/2020)
Deep kernel learning (DKL) leverages the connection between Gaussian pro...

ICE-BeeM: Identifiable Conditional Energy-Based Deep Models (02/26/2020)
Despite the growing popularity of energy-based models, their identifiabi...

Gradient-Free Kernel Stein Discrepancy (07/06/2022)
Stein discrepancies have emerged as a powerful statistical tool, being a...
