Differentiable Factor Graph Optimization for Learning Smoothers

05/18/2021
by Brent Yi, et al.

A recent line of work has shown that end-to-end optimization of Bayesian filters can be used to learn state estimators for systems whose underlying models are difficult to hand-design or tune, while retaining the core advantages of probabilistic state estimation. As an alternative for state estimation in these settings, we present an end-to-end approach for learning state estimators modeled as factor graph-based smoothers. By unrolling the optimizer we use for maximum a posteriori inference in these probabilistic graphical models, our method learns probabilistic system models in the full context of an overall state estimator, while also taking advantage of the distinct accuracy and runtime advantages that smoothers offer over recursive filters. We study our approach on two fundamental state estimation problems, object tracking and visual odometry, where we demonstrate a significant improvement over existing baselines. Our work comes with an extensive code release, which includes the evaluated models and libraries for differentiable Lie theory and factor graph optimization: https://sites.google.com/view/diffsmoothing/
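The central idea of unrolling the MAP optimizer can be illustrated with a minimal sketch. This is not the released library: it is a toy JAX example under stated assumptions, where the 1D constant-position factor graph, the learnable `log_process_noise` parameter, and helpers such as `map_estimate` are all illustrative names. It shows how a fixed number of Gauss-Newton iterations can be unrolled so that a tracking loss on the smoothed trajectory is backpropagated through the optimizer to learn a noise model.

```python
# Minimal sketch (not the paper's released code): unroll Gauss-Newton MAP
# inference for a toy 1D factor graph smoother and learn a process-noise
# scale end-to-end by differentiating a tracking loss through the optimizer.
import jax
import jax.numpy as jnp

T = 20  # trajectory length


def smoother_residuals(x, observations, log_process_noise):
    """Stacked factor residuals for a 1D constant-position factor graph."""
    # Observation factors: each state should match its (noisy) measurement.
    obs_res = x - observations
    # Dynamics factors: consecutive states should stay close, weighted by a
    # learnable process-noise scale (smaller noise => stiffer smoothing).
    dyn_res = (x[1:] - x[:-1]) / jnp.exp(log_process_noise)
    return jnp.concatenate([obs_res, dyn_res])


def map_estimate(observations, log_process_noise, num_iters=5):
    """MAP inference via a fixed (unrolled) number of Gauss-Newton steps."""
    x0 = observations  # initialize states at the measurements

    def gn_step(x, _):
        res = smoother_residuals(x, observations, log_process_noise)
        J = jax.jacobian(smoother_residuals)(x, observations, log_process_noise)
        # Solve the (damped) normal equations J^T J dx = -J^T r for the update.
        dx = jnp.linalg.solve(J.T @ J + 1e-6 * jnp.eye(T), -J.T @ res)
        return x + dx, None

    x, _ = jax.lax.scan(gn_step, x0, None, length=num_iters)
    return x


def tracking_loss(log_process_noise, observations, ground_truth):
    # End-to-end loss: error of the smoothed trajectory vs. ground truth.
    return jnp.mean((map_estimate(observations, log_process_noise) - ground_truth) ** 2)


# Gradients flow through every unrolled Gauss-Newton iteration.
ground_truth = jnp.cumsum(0.1 * jax.random.normal(jax.random.PRNGKey(0), (T,)))
observations = ground_truth + 0.5 * jax.random.normal(jax.random.PRNGKey(1), (T,))

log_process_noise = jnp.array(0.0)
grad_fn = jax.jit(jax.grad(tracking_loss))
for step in range(100):
    log_process_noise -= 0.1 * grad_fn(log_process_noise, observations, ground_truth)

print("learned process-noise scale:", float(jnp.exp(log_process_noise)))
```

Because each Gauss-Newton step is composed of differentiable operations (residual evaluation, Jacobian computation, a linear solve), standard automatic differentiation yields gradients of the smoothed trajectory with respect to model parameters, which is what lets noise models and other system models be trained in the full context of the estimator rather than in isolation.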

Related research:

10/25/2020 - Multimodal Sensor Fusion with Differentiable Filters
Leveraging multimodal information with recursive Bayesian filters improv...

05/28/2018 - Differentiable Particle Filters: End-to-End Learning with Algorithmic Priors
We present differentiable particle filters (DPFs): a differentiable impl...

08/19/2023 - Enhancing State Estimation in Robots: A Data-Driven Approach with Differentiable Ensemble Kalman Filters
This paper introduces a novel state estimation framework for robots usin...

08/04/2021 - LEO: Learning Energy-based Models in Graph Optimization
We address the problem of learning observation models end-to-end for est...

03/08/2023 - DNBP: Differentiable Nonparametric Belief Propagation
We present a differentiable approach to learn the probabilistic factors ...

12/28/2020 - How to Train Your Differentiable Filter
In many robotic applications, it is crucial to maintain a belief about t...

07/19/2022 - Theseus: A Library for Differentiable Nonlinear Optimization
We present Theseus, an efficient application-agnostic open source librar...
