
Differentiable Nonparametric Belief Propagation

by Anthony Opipari, et al.

We present a differentiable approach to learning the probabilistic factors used for inference by a nonparametric belief propagation algorithm. Existing nonparametric belief propagation methods rely on domain-specific features encoded in the probabilistic factors of a graphical model. In this work, we replace each hand-crafted factor with a differentiable neural network, enabling the factors to be learned from labeled data using an efficient optimization routine. By combining differentiable neural networks with an efficient belief propagation algorithm, our method learns to maintain a set of marginal posterior samples using end-to-end training. We evaluate our differentiable nonparametric belief propagation (DNBP) method on a set of articulated pose tracking tasks and compare its performance with a recurrent neural network. Results from this comparison demonstrate the effectiveness of using learned factors for tracking and suggest a practical advantage over hand-crafted approaches. The project webpage is available at:
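The core idea of the abstract, replacing a hand-crafted pairwise potential with a small neural network and passing nonparametric (particle-based) messages through it, can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes PyTorch, and the class and function names (`PairwiseFactor`, `message_weights`) are illustrative only:

```python
import torch
import torch.nn as nn


class PairwiseFactor(nn.Module):
    """Learned potential psi(x_s, x_t) standing in for a hand-crafted factor."""

    def __init__(self, dim):
        super().__init__()
        # Softplus keeps the potential nonnegative, as a potential should be.
        self.net = nn.Sequential(
            nn.Linear(2 * dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Softplus()
        )

    def forward(self, xs, xt):
        # xs: (N, dim) particles at the sending node;
        # xt: (M, dim) particles at the receiving node.
        N, M = xs.shape[0], xt.shape[0]
        pairs = torch.cat(
            [xs.unsqueeze(1).expand(N, M, -1), xt.unsqueeze(0).expand(N, M, -1)],
            dim=-1,
        )
        return self.net(pairs).squeeze(-1)  # (N, M) pairwise potentials


def message_weights(factor, sender_particles, receiver_particles):
    """One nonparametric BP message: weight each receiver particle by the
    average learned potential over the sender's particle set."""
    pot = factor(sender_particles, receiver_particles)  # (N, M)
    w = pot.mean(dim=0)                                 # (M,)
    return w / w.sum()                                  # normalized weights


torch.manual_seed(0)
factor = PairwiseFactor(dim=2)
xs = torch.randn(50, 2)  # particles at the sending node
xt = torch.randn(40, 2)  # particles at the receiving node
w = message_weights(factor, xs, xt)

# End-to-end training: any loss on the weighted belief backpropagates
# through the message computation into the factor network's parameters.
loss = (w * xt[:, 0]).sum()
loss.backward()
```

Because every step (potential evaluation, averaging, normalization) is differentiable, gradients from a tracking loss reach the factor parameters, which is what lets learned factors replace crafted ones.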




A Simple Insight into Iterative Belief Propagation's Success

In non-ergodic belief networks the posterior belief of many queries gi...

Factored Probabilistic Belief Tracking

The problem of belief tracking in the presence of stochastic actions and...

Belief Propagation Neural Networks

Learned neural solvers have successfully been used to solve combinatoria...

Neural Enhanced Belief Propagation on Factor Graphs

A graphical model is a structured representation of locally dependent ra...

Multi-domain Dialog State Tracking using Recurrent Neural Networks

Dialog state tracking is a key component of many modern dialog systems, ...

How to Train Your Differentiable Filter

In many robotic applications, it is crucial to maintain a belief about t...

k-meansNet: When k-means Meets Differentiable Programming

In this paper, we study how to make clustering benefit from different...