How To Train Your Program

05/08/2021
by David Tolpin, et al.

We present a Bayesian approach to machine learning with probabilistic programs. In our approach, training on available data is implemented as inference on a hierarchical model. The posterior distribution of model parameters is then used to stochastically condition a complementary model, such that inference on new data yields the same posterior distribution over the latent parameters of the new data as inference on the hierarchical model applied to the combination of the previously available and the new data, but at a lower computational cost. We frame the approach as a probabilistic programming design pattern, referred to herein as `stump and fungus', and illustrate its realization on a didactic case study.
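
To make the training-then-conditioning workflow concrete, here is a minimal sketch in NumPyro. The Beta-Binomial hierarchical model, the variable names, and the use of empirical resampling of stored posterior draws to stand in for stochastic conditioning are all assumptions of this sketch; it illustrates the general idea, not the paper's exact `stump and fungus' realization.

    import jax.numpy as jnp
    from jax import random
    import numpyro
    import numpyro.distributions as dist
    from numpyro.infer import MCMC, NUTS, DiscreteHMCGibbs

    def hierarchical(counts, trials):
        # "Stump": population-level parameters shared by all groups.
        a = numpyro.sample("a", dist.Exponential(1.0))
        b = numpyro.sample("b", dist.Exponential(1.0))
        with numpyro.plate("group", counts.shape[0]):
            # Per-group latent success rates and their observations.
            p = numpyro.sample("p", dist.Beta(a, b))
            numpyro.sample("y", dist.Binomial(trials, probs=p), obs=counts)

    # "Training": inference on the hierarchical model over the available data.
    counts = jnp.array([3, 5, 9])
    trials = jnp.array([10, 10, 10])
    mcmc = MCMC(NUTS(hierarchical), num_warmup=500, num_samples=1000)
    mcmc.run(random.PRNGKey(0), counts, trials)
    post = mcmc.get_samples()  # posterior draws of a and b

    def conditioned(count, trials, a_post, b_post):
        # Complementary model: the prior on (a, b) is replaced by the
        # empirical posterior learned above (stochastic conditioning is
        # approximated here by resampling stored posterior draws).
        idx = numpyro.sample("idx", dist.Categorical(logits=jnp.zeros(a_post.shape[0])))
        p = numpyro.sample("p_new", dist.Beta(a_post[idx], b_post[idx]))
        numpyro.sample("y_new", dist.Binomial(trials, probs=p), obs=count)

    # Inference on the new data only; the discrete index is handled by Gibbs updates.
    kernel = DiscreteHMCGibbs(NUTS(conditioned))
    mcmc_new = MCMC(kernel, num_warmup=500, num_samples=1000)
    mcmc_new.run(random.PRNGKey(1), 7, 10, post["a"], post["b"])

Under this reading of the abstract, the second inference pass touches only the new observation, with the learned posterior entering as a fixed set of draws; avoiding a re-run of inference over the full combined dataset is where the lower computational cost comes from.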

Related research

- Attention for Inference Compilation (10/25/2019). We present a new approach to automatic amortized inference in universal ...
- Online Bayesian phylodynamic inference in BEAST with application to epidemic reconstruction (02/01/2020). Reconstructing pathogen dynamics from genetic data as they become availa...
- Detecting Parameter Symmetries in Probabilistic Models (12/19/2013). Probabilistic models often have parameters that can be translated, scale...
- Identifiability and Falsifiability: Two Challenges for Bayesian Model Expansion (07/26/2023). We study the identifiability of model parameters and falsifiability of m...
- Bayesian Incremental Learning for Deep Neural Networks (02/20/2018). In industrial machine learning pipelines, data often arrive in parts. Pa...
- Making Recursive Bayesian Inference Accessible (07/28/2018). Bayesian models are naturally equipped to provide recursive inference be...
- Foundation Posteriors for Approximate Probabilistic Inference (05/19/2022). Probabilistic programs provide an expressive representation language for...
