MUSE: Marginal Unbiased Score Expansion and Application to CMB Lensing

12/17/2021
by   Marius Millea, et al.

We present the marginal unbiased score expansion (MUSE) method, an algorithm for generic high-dimensional hierarchical Bayesian inference. MUSE performs approximate marginalization over arbitrary non-Gaussian latent parameter spaces, yielding Gaussianized, asymptotically unbiased, and near-optimal constraints on global parameters of interest. It is computationally much cheaper than exact alternatives like Hamiltonian Monte Carlo (HMC), excelling on funnel problems which challenge HMC, and does not require any problem-specific user supervision, unlike other approximate methods such as Variational Inference or many Simulation-Based Inference methods. MUSE makes possible the first joint Bayesian estimation of the delensed Cosmic Microwave Background (CMB) power spectrum and gravitational lensing potential power spectrum, demonstrated here on a simulated data set as large as the upcoming South Pole Telescope 3G 1500 deg^2 survey, corresponding to a latent dimensionality of ∼ 6 million and of order 100 global bandpower parameters. On a subset of the problem where an exact but more expensive HMC solution is feasible, we verify that MUSE yields nearly optimal results. We also demonstrate that existing spectrum-based forecasting tools which ignore pixel-masking underestimate predicted error bars by only ∼ 10%. This method is a promising path forward for fast lensing and delensing analyses which will be necessary for future CMB experiments such as SPT-3G, Simons Observatory, or CMB-S4, and can complement or supersede existing HMC approaches. The success of MUSE on this challenging problem strengthens its case as a generic procedure for a broad class of high-dimensional inference problems.
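The core MUSE idea described above can be illustrated on a toy problem. The sketch below is an assumption for illustration only (a one-parameter Gaussian hierarchy, not the paper's CMB model, and not the authors' MuseInference code): for a trial value of the global parameter, maximize the joint log density over the latents, evaluate the gradient of the joint log density with respect to the global parameter at that latent MAP, and adjust the global parameter until this "data score" matches the average score of simulations generated at the same parameter value, which is what removes the bias from the naive MAP estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hierarchical model (an illustrative assumption, not the paper's CMB model):
#   latent  z_i ~ N(0, e^theta)
#   data    x_i = z_i + n_i,   n_i ~ N(0, 1)
# theta is the single global parameter (the log of the latent variance).

def simulate(theta, n):
    z = rng.normal(0.0, np.exp(theta / 2), n)
    return z + rng.normal(0.0, 1.0, n)

def map_score(theta, x):
    """Maximize the joint log density over the latents at fixed theta, then
    return d/dtheta of the joint log density evaluated at that latent MAP."""
    # For this quadratic toy model the latent MAP is available in closed form;
    # in a realistic problem it would come from a gradient-based optimizer.
    zhat = x / (1.0 + np.exp(-theta))
    # log p(x, z | theta) = -0.5*sum((x-z)^2) - 0.5*e^{-theta}*sum(z^2) - 0.5*n*theta + const
    # so d/dtheta at z = zhat is:
    return 0.5 * np.exp(-theta) * np.sum(zhat**2) - 0.5 * len(x)

theta_true, n = 0.5, 2000
x_obs = simulate(theta_true, n)

# MUSE-style iteration: move theta until the data score matches the mean score
# of simulations generated at that same theta (this is the debiasing step).
theta = 0.0
for _ in range(30):
    sim_scores = [map_score(theta, simulate(theta, n)) for _ in range(20)]
    mismatch = map_score(theta, x_obs) - np.mean(sim_scores)
    theta += mismatch / (0.2 * n)  # crude damped step; a Jacobian would give Newton steps

# For this toy model the marginal likelihood is tractable, so we can compare
# against the exact marginal MLE: x_i ~ N(0, e^theta + 1).
theta_mle = np.log(np.mean(x_obs**2) - 1.0)
```

In this Gaussian toy case the fixed point of the iteration coincides (up to simulation noise) with the exact marginal maximum-likelihood estimate, which is the sense in which the matching condition removes the bias of the plain joint-MAP estimate.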

