Lagrangian uncertainty quantification and information inequalities for stochastic flows

05/21/2019
by   Michal Branicki, et al.

We develop a systematic information-theoretic framework for quantifying and mitigating error in probabilistic Lagrangian (i.e., trajectory-based) predictions obtained from the (Eulerian) vector fields generating the underlying dynamical system; the framework applies naturally in both deterministic and stochastic settings. This work is motivated by the desire to improve Lagrangian predictions in complex, multi-scale systems based on simplified, data-driven models. Here, discrepancies between the probability measures μ and ν associated with the true dynamics and its approximation are quantified via so-called φ-divergences, D_φ(μ‖ν), which are premetrics defined by a class of strictly convex functions φ. We derive general information bounds on the uncertainty in estimates, E^ν[f], of 'true' observables E^μ[f] in terms of φ-divergences; we then derive two distinct bounds on D_φ(μ‖ν) itself. First, an analytically tractable bound on D_φ(μ‖ν) is derived from differences between the vector fields generating the true dynamics and its approximations. The second bound on D_φ(μ‖ν) is based on a difference of so-called finite-time divergence rate (FTDR) fields, and it can be exploited within a computational framework to mitigate the error in Lagrangian predictions by tuning the fields of expansion rates obtained from simplified models. This new framework provides a systematic link between Eulerian (field-based) model error and the resulting uncertainty in Lagrangian (trajectory-based) predictions.
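To make the central notion concrete: for measures with densities, a φ-divergence takes the form D_φ(μ‖ν) = E^ν[φ(dμ/dν)] for a strictly convex φ with φ(1) = 0, recovering the Kullback-Leibler divergence for φ(t) = t log t and the χ²-divergence for φ(t) = (t − 1)². The following sketch (not from the paper; function names and the discrete setting are illustrative assumptions) evaluates this for finite probability vectors:

```python
import numpy as np

def phi_divergence(mu, nu, phi):
    """Discrete phi-divergence D_phi(mu || nu) = sum_i nu_i * phi(mu_i / nu_i).

    mu, nu : probability vectors with nu_i > 0 wherever mu_i > 0
    phi    : strictly convex function with phi(1) = 0
    """
    mu = np.asarray(mu, dtype=float)
    nu = np.asarray(nu, dtype=float)
    ratio = mu / nu  # likelihood ratio dmu/dnu on each atom
    return float(np.sum(nu * phi(ratio)))

# Two classical members of the phi-divergence family:
kl = lambda t: t * np.log(t)      # Kullback-Leibler
chi2 = lambda t: (t - 1.0) ** 2   # chi-squared

mu = [0.5, 0.3, 0.2]  # "true" distribution (illustrative)
nu = [0.4, 0.4, 0.2]  # approximate-model distribution (illustrative)
print(phi_divergence(mu, nu, kl))    # positive: distributions differ
print(phi_divergence(mu, nu, chi2))
```

Since φ is strictly convex with φ(1) = 0, Jensen's inequality gives D_φ(μ‖ν) ≥ 0, with equality exactly when μ = ν; this is the premetric property the abstract refers to.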
