Attention for Inference Compilation

10/25/2019
by William Harvey et al.

We present a new approach to automatic amortized inference in universal probabilistic programs which improves performance compared to current methods. Our approach is a variation of inference compilation (IC) which leverages deep neural networks to approximate a posterior distribution over latent variables in a probabilistic program. A challenge with existing IC network architectures is that they can fail to model long-range dependencies between latent variables. To address this, we introduce an attention mechanism that attends to the most salient variables previously sampled in the execution of a probabilistic program. We demonstrate that the addition of attention allows the proposal distributions to better match the true posterior, enhancing inference about latent variables in simulators.
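As a rough illustration of the idea described above, the sketch below shows generic scaled dot-product attention over embeddings of previously sampled latent variables: each earlier sample contributes a key and a value, the current sample site supplies a query, and the resulting context vector could then parameterize a proposal distribution. This is a minimal, hypothetical sketch of the general mechanism, not the paper's actual architecture; all names and dimensions are assumptions for the example.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys, values):
    # scaled dot-product attention: weight earlier samples by
    # how well their keys match the current query
    scores = keys @ query / np.sqrt(len(query))
    weights = softmax(scores)
    context = weights @ values
    return context, weights

# toy setup (all sizes hypothetical): 3 previously sampled latent
# variables, each embedded into a 4-dimensional vector
rng = np.random.default_rng(0)
keys = rng.normal(size=(3, 4))    # one key per earlier sample
values = rng.normal(size=(3, 4))  # value embedding per earlier sample
query = rng.normal(size=4)        # embedding of the current sample site

context, weights = attend(query, keys, values)
# `context` would feed into a network producing proposal parameters
```

The attention weights form a distribution over the earlier samples, so the proposal at the current site can draw most of its signal from whichever previously sampled variables are most salient, rather than relying on a fixed-length recurrent summary.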

