TensorFlow Distributions

by Joshua V. Dillon, et al.

The TensorFlow Distributions library implements a vision of probability theory adapted to the modern deep-learning paradigm of end-to-end differentiable computation. Building on two basic abstractions, it offers flexible building blocks for probabilistic computation. Distributions provide fast, numerically stable methods for generating samples and computing statistics, e.g., log density. Bijectors provide composable volume-tracking transformations with automatic caching. Together these enable modular construction of high-dimensional distributions and transformations not possible with previous libraries (e.g., PixelCNNs, autoregressive flows, and reversible residual networks). They are the workhorse behind deep probabilistic programming systems like Edward, and they empower fast black-box inference in probabilistic models built on deep-network components. TensorFlow Distributions has proven to be an important part of the TensorFlow toolkit within Google and in the broader deep learning community.
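The "volume-tracking" role of a Bijector can be made concrete with the change-of-variables formula: if Y = g(X), then log p_Y(y) = log p_X(g⁻¹(y)) + log |det J_{g⁻¹}(y)|. Below is a minimal, self-contained sketch of that mechanism in plain Python; the class and function names (`Exp`, `transformed_log_prob`) are illustrative stand-ins, not the library's actual API. Applying an exponential bijector to a standard Normal yields a log-Normal, whose density we recover without ever writing it down directly.

```python
import math

def normal_log_prob(x):
    # log density of a standard Normal base distribution
    return -0.5 * x * x - 0.5 * math.log(2.0 * math.pi)

class Exp:
    # Illustrative "bijector": forward map, its inverse, and the
    # log |det Jacobian| of the inverse, which tracks volume change.
    def forward(self, x):
        return math.exp(x)

    def inverse(self, y):
        return math.log(y)

    def inverse_log_det_jacobian(self, y):
        # d/dy log(y) = 1/y, so log|det J| = -log(y)
        return -math.log(y)

def transformed_log_prob(base_log_prob, bijector, y):
    # Change of variables:
    # log p_Y(y) = log p_X(g^{-1}(y)) + log |det J_{g^{-1}}(y)|
    return base_log_prob(bijector.inverse(y)) + bijector.inverse_log_det_jacobian(y)

# Density of exp(Normal(0, 1)) -- i.e., a standard log-Normal -- at y = 2.
lp = transformed_log_prob(normal_log_prob, Exp(), 2.0)
```

Because bijectors compose (the log-Jacobian terms simply add along the chain), stacking many such transformations yields expressive distributions like autoregressive flows while keeping log-density evaluation exact.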
