
Compressed Monte Carlo with application in particle filtering

by Luca Martino et al.

Bayesian models have become increasingly popular in recent years in several fields such as signal processing, statistics, and machine learning. Bayesian inference requires the approximation of complicated integrals involving posterior distributions. For this purpose, Monte Carlo (MC) methods, such as Markov chain Monte Carlo and importance sampling algorithms, are often employed. In this work, we introduce the theory and practice of a Compressed MC (C-MC) scheme to compress the statistical information contained in a set of random samples. In its basic version, C-MC is closely related to stratification, a well-known technique used for variance reduction. Deterministic C-MC schemes are also presented, which provide very good performance. The compression problem is closely related to the moment-matching approach applied in different filtering techniques, usually known as Gaussian quadrature rules or sigma-point methods. C-MC can be employed in a distributed Bayesian inference framework when cheap and fast communication with a central processor is required. Furthermore, C-MC is useful within particle filtering and adaptive IS algorithms, as shown by three novel schemes introduced in this work. Six numerical experiments confirm the benefits of the introduced schemes, which outperform the corresponding benchmark methods. A related code is also provided.
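The basic idea of the stratification-based compression described above can be sketched in a few lines: partition a set of weighted samples into strata and summarize each stratum by its weighted mean, carrying the stratum's total weight. The sketch below is an illustrative assumption about the simplest such scheme (the function name `compress_mc`, the sorting-based stratification, and the toy Gaussian target are choices made here for illustration, not the authors' exact algorithm); by construction it preserves the first moment of the original weighted sample set exactly.

```python
import numpy as np

def compress_mc(samples, weights, M):
    """Compress N weighted 1-D Monte Carlo samples into M summary particles.

    Illustrative stratified compression: sort the samples, split them into
    M contiguous strata, and represent each stratum by its weighted mean
    together with the stratum's total (normalized) weight.
    """
    order = np.argsort(samples)
    s = samples[order]
    w = weights[order] / weights.sum()
    # split the sorted sample indices into M strata of (roughly) equal size
    strata = np.array_split(np.arange(len(s)), M)
    summary = np.array([np.sum(w[idx] * s[idx]) / np.sum(w[idx]) for idx in strata])
    summary_w = np.array([np.sum(w[idx]) for idx in strata])
    return summary, summary_w

# toy example: compress 10_000 importance-sampling draws into 20 particles
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=10_000)      # proposal draws
logw = -0.5 * (x - 0.5) ** 2               # unnormalized IS log-weights (toy target)
w = np.exp(logw - logw.max())

xc, wc = compress_mc(x, w, M=20)

full_mean = np.sum(w * x) / np.sum(w)      # estimate from all 10_000 samples
comp_mean = np.sum(wc * xc)                # wc is already normalized
print(len(xc), np.isclose(full_mean, comp_mean))
```

Because each summary particle is the weighted mean of its stratum, the compressed weighted mean equals the full weighted mean exactly; higher moments are only approximated, which is where the deterministic and moment-matching variants mentioned in the abstract come in.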


Compressed particle methods for expensive models with application in Astronomy and Remote Sensing

In many inference problems, the evaluation of complex and costly models ...

Approximate Shannon Sampling in Importance Sampling: Nearly Consistent Finite Particle Estimates

In Bayesian inference, we seek to compute information about random varia...

Group Importance Sampling for Particle Filtering and MCMC

Importance Sampling (IS) is a well-known Monte Carlo technique that appr...

Structured Monte Carlo Sampling for Nonisotropic Distributions via Determinantal Point Processes

We propose a new class of structured methods for Monte Carlo (MC) sampli...

Why Simple Quadrature is just as good as Monte Carlo

We motivate and calculate Newton-Cotes quadrature integration variance and...

MC-CIM: Compute-in-Memory with Monte-Carlo Dropouts for Bayesian Edge Intelligence

We propose MC-CIM, a compute-in-memory (CIM) framework for robust, yet l...

Monte-Carlo Methods for the Neutron Transport Equation

This paper continues our treatment of the Neutron Transport Equation (NT...