Joint Distributions for TensorFlow Probability

01/22/2020
by Dan Piponi et al.

A central tenet of probabilistic programming is that a model is specified exactly once in a canonical representation which is usable by inference algorithms. We describe JointDistributions, a family of declarative representations of directed graphical models in TensorFlow Probability.

Related research

- On Graphical Models via Univariate Exponential Family Distributions (01/17/2013): Undirected graphical models, or Markov networks, are a popular class of ...
- TensorFlow Distributions (11/28/2017): The TensorFlow Distributions library implements a vision of probability ...
- Joint Probability Trees (02/14/2023): We introduce Joint Probability Trees (JPT), a novel approach that makes ...
- Deep Probabilistic Programming (01/13/2017): We propose Edward, a Turing-complete probabilistic programming language. ...
- Variable Elimination in the Fourier Domain (08/17/2015): The ability to represent complex high dimensional probability distributi...
- A first approach to closeness distributions (11/16/2021): Probabilistic graphical models allow us to encode a large probability di...
- TensorGP – Genetic Programming Engine in TensorFlow (03/12/2021): In this paper, we resort to the TensorFlow framework to investigate the ...
