
Scalable optimization-based sampling on function space

by Zheng Wang et al.
Monash University
The University of Montana

Optimization-based samplers provide an efficient and parallelizable approach to solving large-scale Bayesian inverse problems. These methods solve randomly perturbed optimization problems to draw samples from an approximate posterior distribution. "Correcting" these samples, either by Metropolization or importance sampling, enables characterization of the original posterior distribution. This paper presents a new geometric interpretation of the randomize-then-optimize (RTO) method [1] and a unified transport-map interpretation of RTO and other optimization-based samplers, namely implicit sampling [19] and randomized maximum likelihood [20]. We then introduce a new subspace acceleration strategy that makes the computational complexity of RTO scale linearly with the parameter dimension. This subspace perspective suggests a natural extension of RTO to a function-space setting. We thus formalize a function-space version of RTO and establish sufficient conditions for it to produce a valid Metropolis-Hastings proposal, yielding dimension-independent sampling performance. Numerical examples corroborate the dimension-independence of RTO and demonstrate sampling performance that is also robust to small observational noise.
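To make the "solve randomly perturbed optimization problems to draw samples" idea concrete, the following is a minimal sketch of a perturbed-optimization sampler (the randomized-maximum-likelihood variant cited above) on a toy linear-Gaussian inverse problem. The problem setup, dimensions, and variable names are illustrative assumptions, not taken from the paper; in this linear case each perturbed least-squares solution is an exact posterior draw, so the Metropolization or importance-sampling correction discussed in the abstract is not needed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian Bayesian inverse problem (illustrative, not from
# the paper): data y = A @ theta + noise, prior theta ~ N(0, I),
# observational noise ~ N(0, sigma^2 I).
d, m, sigma = 4, 10, 0.1
A = rng.standard_normal((m, d))
theta_true = rng.standard_normal(d)
y = A @ theta_true + sigma * rng.standard_normal(m)

# Exact Gaussian posterior, for reference.
post_prec = A.T @ A / sigma**2 + np.eye(d)   # posterior precision
post_cov = np.linalg.inv(post_prec)
post_mean = post_cov @ (A.T @ y / sigma**2)

def rml_sample():
    """One randomized-maximum-likelihood draw: perturb the data and the
    prior mean, then solve the resulting regularized least-squares
    problem  argmin_t ||y + eps - A t||^2 / sigma^2 + ||t - xi||^2.
    Each draw is an independent optimization, hence parallelizable."""
    eps = sigma * rng.standard_normal(m)   # data perturbation
    xi = rng.standard_normal(d)            # prior perturbation
    return np.linalg.solve(post_prec, A.T @ (y + eps) / sigma**2 + xi)

samples = np.array([rml_sample() for _ in range(20000)])
print(np.max(np.abs(samples.mean(axis=0) - post_mean)))  # small
```

For a nonlinear forward map the perturbed optima are only approximate posterior samples, which is where the correction step (and, in the paper, the subspace-accelerated, function-space RTO formulation) comes in.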


