Variational Measure Preserving Flows

05/25/2018
by Yichuan Zhang, et al.

Probabilistic modelling is a general and elegant framework for capturing the uncertainty, ambiguity and diversity of hidden structure in data. Probabilistic inference is the key operation on probabilistic models: computing the distribution over latent representations given data. Unfortunately, inference on complex models is computationally very challenging. Despite the success of existing inference methods such as Markov chain Monte Carlo (MCMC) and variational inference (VI), many powerful models remain unusable for large-scale problems because inference is simply computationally intractable. Recent advances in using neural networks for probabilistic inference have shown promising results on this challenge. In this work, we propose a novel general inference framework that combines the strengths of both MCMC and VI. The proposed method is not only computationally scalable and efficient, but is also rooted in the ergodic theorem, which guarantees better performance given more computational power. Our experimental results suggest that our method can outperform state-of-the-art methods on generative models and Bayesian neural networks on several popular benchmark problems.
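The abstract does not spell out how the measure-preserving flow is constructed, so the following Python snippet is only a hedged sketch of the general idea suggested by the title: a deterministic, volume-preserving map (here a leapfrog integrator on an augmented state, as used in Hamiltonian MCMC) pushed forward from a simple variational distribution. The target `grad_log_target`, the step size, and the step count are all illustrative assumptions, not the authors' method.

```python
import numpy as np

def grad_log_target(x):
    # Assumed toy target: a standard Gaussian, so grad log p(x) = -x.
    return -x

def leapfrog_flow(x, p, step_size=0.1, n_steps=10):
    """Deterministic leapfrog map (x, p) -> (x', p').

    Each leapfrog step is volume-preserving (its Jacobian determinant
    is exactly 1), so the composed flow preserves Lebesgue measure.
    A variational flow built from such maps therefore needs no
    log-det-Jacobian correction when evaluating the pushed-forward
    density -- the property a measure-preserving flow exploits.
    """
    p = p + 0.5 * step_size * grad_log_target(x)  # half momentum step
    for _ in range(n_steps - 1):
        x = x + step_size * p                      # full position step
        p = p + step_size * grad_log_target(x)     # full momentum step
    x = x + step_size * p
    p = p + 0.5 * step_size * grad_log_target(x)  # closing half step
    return x, p

# Usage sketch: draw latents and auxiliary momenta from a simple
# variational distribution q0, then transport them through the flow.
rng = np.random.default_rng(0)
x0 = rng.normal(size=(5, 2))   # latent samples from q0
p0 = rng.normal(size=(5, 2))   # auxiliary momentum variables
xT, pT = leapfrog_flow(x0, p0)
```

In a full method along these lines, the flow parameters (e.g. step sizes) would be tuned by a variational objective, while the MCMC-style dynamics supply the ergodicity argument mentioned in the abstract; the sketch above shows only the measure-preserving transport itself.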
