Inference in conditioned dynamics through causality restoration

10/18/2022
by Alfredo Braunstein et al.

Computing observables from conditioned dynamics is typically computationally hard: although independent samples can usually be obtained efficiently from the unconditioned dynamics, most of them must be discarded (as in a form of importance sampling) because they do not satisfy the imposed conditions. Sampling directly from the conditioned distribution is non-trivial, as conditioning breaks the causal structure of the dynamics that ultimately makes the sampling procedure efficient. One standard way of achieving it is through a Metropolis Monte Carlo procedure, but this procedure is normally slow: a very large number of Monte Carlo steps is needed to obtain a small number of statistically independent samples. In this work, we propose an alternative method to produce independent samples from a conditioned distribution. The method learns the parameters of a generalized dynamical model that optimally describes the conditioned distribution in a variational sense. The outcome is an effective, unconditioned dynamical model from which one can trivially obtain independent samples, effectively restoring the causality of the conditioned distribution. The consequences are twofold: on the one hand, observables of the conditioned dynamics can be computed efficiently by simply averaging over independent samples; on the other hand, the method yields an effective unconditioned distribution that is easier to interpret. The method is flexible and can be applied to virtually any dynamics. We discuss an important application of the method, namely the problem of epidemic risk assessment from (imperfect) clinical tests, for a large family of continuous-time epidemic models endowed with a Gillespie-like sampler. We show that the method compares favorably against the state of the art, including the soft-margin approach and mean-field methods.
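To make the motivating problem concrete, the sketch below illustrates the baseline the abstract argues against: naive rejection sampling from a conditioned dynamics. It draws independent trajectories of a toy stochastic SIR model with a Gillespie-like sampler and keeps only those satisfying an observation-like condition. The model, its parameters, and the condition (at least 20 infections) are illustrative assumptions, not the authors' setup; the point is only that the acceptance rate is small, so most unconditioned samples are wasted.

```python
import random

def gillespie_sir(beta, gamma, n, i0, t_max, rng):
    # One trajectory of a toy stochastic SIR model, sampled with a
    # Gillespie-like scheme: exponential waiting times between events,
    # each event chosen with probability proportional to its rate.
    # Returns the total number of infection events (illustrative statistic).
    s, i = n - i0, i0
    t, n_infections = 0.0, 0
    while i > 0:
        rate_inf = beta * s * i / n   # infection:  S + I -> 2I
        rate_rec = gamma * i          # recovery:   I -> R
        total = rate_inf + rate_rec
        t += rng.expovariate(total)
        if t > t_max:
            break
        if rng.random() * total < rate_inf:
            s, i = s - 1, i + 1
            n_infections += 1
        else:
            i -= 1
    return n_infections

def rejection_sample(condition, n_trials, rng, **model):
    # Naive conditioning: draw independent unconditioned trajectories
    # and discard every one that violates the condition.
    kept = [x for x in (gillespie_sir(rng=rng, **model)
                        for _ in range(n_trials)) if condition(x)]
    return kept, len(kept) / n_trials

rng = random.Random(0)
model = dict(beta=0.8, gamma=1.0, n=200, i0=1, t_max=10.0)
# Hypothetical "clinical observation": at least 20 infections occurred.
kept, acc = rejection_sample(lambda k: k >= 20, 2000, rng, **model)
print(f"acceptance rate: {acc:.3f}")  # typically small: most samples discarded
```

The method proposed in the paper avoids this waste by learning an effective unconditioned model whose independent samples already follow the conditioned distribution.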

