Foundation Posteriors for Approximate Probabilistic Inference

05/19/2022
by Mike Wu, et al.

Probabilistic programs provide an expressive representation language for generative models. Given a probabilistic program, we are interested in the task of posterior inference: estimating a latent variable given a set of observed variables. Existing techniques for inference in probabilistic programs often require choosing many hyper-parameters, are computationally expensive, and/or only work for restricted classes of programs. Here we formulate inference as masked language modeling: given a program, we generate a supervised dataset of variables and assignments, and randomly mask a subset of the assignments. We then train a neural network to unmask the random values, defining an approximate posterior distribution. By optimizing a single neural network across a range of programs we amortize the cost of training, yielding a “foundation” posterior able to do zero-shot inference for new programs. The foundation posterior can also be fine-tuned for a particular program and dataset by optimizing a variational inference objective. We show the efficacy of the approach, zero-shot and fine-tuned, on a benchmark of Stan programs.
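The training idea described in the abstract (sample variable assignments from a program, mask a random subset, and train a network to reconstruct the masked values, so that the reconstruction head acts as an approximate posterior) can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes a toy two-variable program (z ~ N(0, 1), x ~ N(z, 0.5²)), hypothetical names such as `sample_trace` and `MaskedPosterior`, and a small PyTorch MLP with a Gaussian output head in place of a language-model-style network trained across many programs.

```python
# Minimal sketch of masked-assignment training for approximate posterior
# inference. Toy program and all names are illustrative assumptions.
import torch
import torch.nn as nn

def sample_trace(batch_size):
    """Toy program: z ~ N(0, 1); x ~ N(z, 0.5^2). Returns assignments [z, x]."""
    z = torch.randn(batch_size, 1)
    x = z + 0.5 * torch.randn(batch_size, 1)
    return torch.cat([z, x], dim=1)               # shape: (batch, num_vars)

class MaskedPosterior(nn.Module):
    """Predicts a Gaussian over each masked assignment from the unmasked ones."""
    def __init__(self, num_vars=2, hidden=64):
        super().__init__()
        # Input: values with masked entries zeroed out, plus the binary mask.
        self.net = nn.Sequential(
            nn.Linear(2 * num_vars, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * num_vars),      # per-variable mean and log-std
        )

    def forward(self, values, mask):
        inp = torch.cat([values * (1 - mask), mask], dim=1)
        mean, log_std = self.net(inp).chunk(2, dim=1)
        return mean, log_std

model = MaskedPosterior()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    values = sample_trace(128)
    mask = (torch.rand_like(values) < 0.5).float()  # randomly mask assignments
    mean, log_std = model(values, mask)
    # Gaussian negative log-likelihood, scored only on the masked positions.
    nll = 0.5 * ((values - mean) / log_std.exp()) ** 2 + log_std
    loss = (nll * mask).sum() / mask.sum().clamp(min=1)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Zero-shot-style inference for this program: observe x, mask z, read off q(z | x).
with torch.no_grad():
    values = torch.tensor([[0.0, 1.2]])           # z unknown (placeholder), x = 1.2
    mask = torch.tensor([[1.0, 0.0]])             # mask the latent z only
    mean, log_std = model(values, mask)
    print("approx posterior over z: mean", mean[0, 0].item(),
          "std", log_std.exp()[0, 0].item())
```

In the paper's setting, a single network would additionally condition on the program text itself and be trained across a range of programs, which is what amortizes training and enables zero-shot inference on new programs; the sketch trains on only one toy program to keep the example self-contained.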


