Convergence Rates of Gaussian ODE Filters

07/25/2018 ∙ by Hans Kersting, et al.

A recently introduced class of probabilistic (uncertainty-aware) solvers for ordinary differential equations (ODEs) applies Gaussian (Kalman) filtering to initial value problems. These methods model the true solution x and its first q derivatives a priori as a Gauss--Markov process X, which is then iteratively conditioned on information about ẋ. We prove worst-case local convergence rates of order h^{q+1} for a wide range of versions of this Gaussian ODE filter, as well as global convergence rates of order h^q in the case of q=1 and an integrated Brownian motion prior, and analyze how inaccurate information on ẋ coming from approximate evaluations of f affects these rates. Moreover, we present explicit formulas for the steady states and show that the posterior confidence intervals are well calibrated in all considered cases that exhibit global convergence, in the sense that they globally contract at the same rate as the truncation error.
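To make the setting concrete, the following is a minimal sketch (not the authors' implementation) of such a Gaussian ODE filter for a scalar ODE x' = f(t, x) with a once-integrated Brownian motion prior (q = 1): the state [x, ẋ] is propagated by the Kalman prediction step and then conditioned on the "derivative observation" z = f(t, predicted x). All function and parameter names here are illustrative assumptions.

```python
import numpy as np

def gaussian_ode_filter(f, x0, t0, t1, h, sigma=1.0, R=0.0):
    """Sketch of a Gaussian (Kalman) ODE filter with a once-integrated
    Brownian motion prior (q = 1) for a scalar ODE x' = f(t, x)."""
    # Prior transition over one step of size h (integrated Brownian motion)
    A = np.array([[1.0, h],
                  [0.0, 1.0]])
    Q = sigma**2 * np.array([[h**3 / 3.0, h**2 / 2.0],
                             [h**2 / 2.0, h]])
    H = np.array([[0.0, 1.0]])           # the "measurement" reads off the derivative coordinate

    m = np.array([x0, f(t0, x0)])        # initial mean: exact x(0) and x'(0)
    P = np.zeros((2, 2))                 # zero initial uncertainty

    ts, means = [t0], [m.copy()]
    t = t0
    while t < t1 - 1e-12:
        # Prediction step under the Gauss--Markov prior
        m = A @ m
        P = A @ P @ A.T + Q
        # Condition on information about ẋ: evaluate f at the predicted mean
        z = f(t + h, m[0])
        S = H @ P @ H.T + R              # innovation covariance (1x1)
        K = P @ H.T / S                  # Kalman gain
        m = m + (K * (z - H @ m)).ravel()
        P = P - K @ H @ P
        t += h
        ts.append(t)
        means.append(m.copy())
    return np.array(ts), np.array(means)

# Usage: x' = -x, x(0) = 1, whose true solution is exp(-t).
if __name__ == "__main__":
    ts, means = gaussian_ode_filter(lambda t, x: -x, x0=1.0, t0=0.0, t1=2.0, h=0.1)
    print(means[-1, 0], np.exp(-2.0))    # posterior mean vs. true solution
```

In this q = 1 setting the results above predict a local error of order h^2 per step and a global error of order h in the posterior mean; halving h in the usage example should roughly halve the gap printed at t = 2.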
