Weakest Preexpectation Semantics for Bayesian Inference

05/18/2020
by Marcin Szymczak, et al.

We present a semantics for a probabilistic while-language with soft conditioning and continuous distributions that also handles programs which diverge with positive probability. To this end, we extend the probabilistic guarded command language (pGCL) with draws from continuous distributions and a score operator. Our main contribution is an extension of the standard weakest preexpectation semantics to support these constructs. As a sanity check of our semantics, we define an alternative trace-based semantics of the language and show that the two semantics are equivalent. Various examples illustrate the applicability of the semantics.
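On the usual reading of a weakest preexpectation semantics, the transformer acts on post-expectations (functions from program states to non-negative reals): a score statement scales the expectation, a random draw averages it over the sampled distribution, and a loop is interpreted as a least fixpoint. The sketch below is not the paper's formalisation; it only illustrates the shape of such a transformer on a tiny pGCL-like AST in which, as a simplifying assumption, continuous draws are replaced by finitely supported distributions (so the integral in the sampling rule becomes a finite sum) and the loop fixpoint is approximated by finite unrolling. All names (`wp`, `Score`, `Sample`, ...) are illustrative, not taken from the paper.

```python
# Minimal sketch of a weakest-preexpectation transformer for a tiny
# pGCL-like language with soft conditioning (score). Continuous draws
# are approximated by finitely supported distributions here.

from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

State = Dict[str, float]
Expectation = Callable[[State], float]   # state -> non-negative real

@dataclass
class Skip: pass

@dataclass
class Assign:
    var: str
    expr: Callable[[State], float]

@dataclass
class Sample:                      # x :~ D, with D given as (value, probability) pairs
    var: str
    dist: List[Tuple[float, float]]

@dataclass
class Score:                       # soft conditioning: multiplies the post-expectation
    expr: Callable[[State], float]

@dataclass
class Seq:
    first: object
    second: object

@dataclass
class If:
    cond: Callable[[State], bool]
    then: object
    els: object

@dataclass
class While:
    cond: Callable[[State], bool]
    body: object
    iters: int = 50                # finite unrolling approximates the least fixpoint

def wp(c, f: Expectation) -> Expectation:
    """Weakest preexpectation of post-expectation f under command c."""
    if isinstance(c, Skip):
        return f
    if isinstance(c, Assign):
        return lambda s: f({**s, c.var: c.expr(s)})
    if isinstance(c, Sample):
        # average f over the drawn value (finite sum instead of an integral)
        return lambda s: sum(p * f({**s, c.var: v}) for v, p in c.dist)
    if isinstance(c, Score):
        return lambda s: c.expr(s) * f(s)
    if isinstance(c, Seq):
        return wp(c.first, wp(c.second, f))
    if isinstance(c, If):
        return lambda s: wp(c.then, f)(s) if c.cond(s) else wp(c.els, f)(s)
    if isinstance(c, While):
        # Kleene iteration from the bottom expectation 0
        g: Expectation = lambda s: 0.0
        for _ in range(c.iters):
            g = (lambda gg: lambda s: wp(c.body, gg)(s) if c.cond(s) else f(s))(g)
        return g
    raise TypeError(f"unknown command {c!r}")

# Example: draw x uniformly from {0, 1, 2}, then weight the run by x.
prog = Seq(Sample("x", [(0, 1/3), (1, 1/3), (2, 1/3)]),
           Score(lambda s: s["x"]))
print(wp(prog, lambda s: s["x"])({}))   # score-weighted expectation of x: 5/3
```

Running the sketch prints the unnormalised, score-weighted expected value of x (5/3 ≈ 1.67); in this style of semantics the normalising constant would typically be obtained as the preexpectation of the constant post-expectation 1.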

Related research

01/17/2020 · Generative Datalog with Continuous Distributions
  Arguing for the need to combine declarative and probabilistic programmin...

01/27/2021 · Compositional Semantics for Probabilistic Programs with Exact Conditioning
  We define a probabilistic programming language for Gaussian random varia...

06/20/2018 · An Application of Computable Distributions to the Semantics of Probabilistic Programs
  In this chapter, we explore how (Type-2) computable distributions can be...

02/25/2018 · Trace semantics via determinization for probabilistic transition systems
  A coalgebraic definition of finite and infinite trace semantics for prob...

08/03/2013 · Measure Transformer Semantics for Bayesian Machine Learning
  The Bayesian approach to machine learning amounts to computing posterior...

09/17/2021 · Fixpoint Semantics for Recursive SHACL
  SHACL is a W3C-proposed language for expressing structural constraints o...

03/07/2018 · Borel Kernels and their Approximation, Categorically
  This paper introduces a categorical framework to study the exact and app...