Learning Probabilistic Programs Using Backpropagation

05/15/2017
by Avi Pfeffer, et al.

Probabilistic modeling enables combining domain knowledge with learning from data, thereby supporting learning from fewer training instances than purely data-driven methods. However, learning probabilistic models is difficult and has not matched the performance of methods such as deep neural networks on many tasks. In this paper, we attempt to address this issue by presenting a method for learning the parameters of a probabilistic program using backpropagation. Our approach opens the possibility of building deep probabilistic programming models that are trained in a similar way to neural networks.
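To make the core idea concrete, the following is a minimal sketch (not the paper's actual method) of learning a probabilistic program's parameter by gradient descent on the log-likelihood. The program here is a toy one, a coin whose bias is `sigmoid(theta)`, and the gradient is derived by the chain rule, which is what backpropagation computes step by step; all names and the model itself are illustrative assumptions.

```python
import math

def sigmoid(x):
    """Squash an unconstrained parameter into a valid probability."""
    return 1.0 / (1.0 + math.exp(-x))

def nll_and_grad(theta, data):
    """Negative log-likelihood of a Bernoulli(sigmoid(theta)) program
    on binary data, and its gradient via the chain rule."""
    p = sigmoid(theta)
    nll = -sum(math.log(p) if x else math.log(1.0 - p) for x in data)
    # Chain rule: d(nll)/d(theta) = sum_i (p - x_i)
    grad = sum(p - x for x in data)
    return nll, grad

def fit(data, lr=0.1, steps=200):
    """Plain gradient descent on the program's parameter."""
    theta = 0.0
    for _ in range(steps):
        _, g = nll_and_grad(theta, data)
        theta -= lr * g
    return sigmoid(theta)

data = [1, 1, 1, 0, 1, 0, 1, 1]  # 6 heads out of 8
print(round(fit(data), 2))  # learned bias -> 0.75, the empirical frequency
```

A real probabilistic program composes many such stochastic choices, and reverse-mode automatic differentiation propagates gradients through the whole composition exactly as it does through a neural network's layers.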

