Variational Inference for Deblending Crowded Starfields

02/04/2021
by Runjing Liu et al.

In the image data collected by astronomical surveys, stars and galaxies often overlap. Deblending is the task of distinguishing and characterizing individual light sources from survey images. We propose StarNet, a fully Bayesian method to deblend sources in astronomical images of crowded star fields. StarNet leverages recent advances in variational inference, including amortized variational distributions and the wake-sleep algorithm. Wake-sleep, which minimizes a forward KL divergence, has significant benefits over traditional variational inference, which minimizes a reverse KL divergence. In our experiments with SDSS images of the M2 globular cluster, StarNet is substantially more accurate than two competing methods: Probabilistic Cataloging (PCAT), a method that uses MCMC for inference, and DAOPHOT, a software pipeline employed by SDSS for deblending. In addition, StarNet is as much as 100,000 times faster than PCAT, exhibiting the scaling characteristics necessary to perform fully Bayesian inference on modern astronomical surveys.
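
The abstract's central methodological contrast is between the forward KL divergence KL(p||q), which wake-sleep minimizes by fitting the amortized posterior to data simulated from the generative model, and the reverse KL divergence KL(q||p), which standard variational inference minimizes via the ELBO. The sketch below illustrates that contrast on a toy Gaussian model with a small amortized encoder. It is only a minimal illustration under assumed settings, not StarNet's implementation; the names (Encoder, sleep_step, elbo_step, prior_std, obs_std) are invented for this example.

```python
# Toy model: z ~ N(0, prior_std^2), x | z ~ N(z, obs_std^2).
# Amortized posterior q(z | x) = N(mu(x), s(x)^2) given by a small MLP.
import torch
import torch.nn as nn

prior_std, obs_std = 1.0, 0.5

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 2))
    def forward(self, x):
        out = self.net(x)
        mu, log_s = out[:, :1], out[:, 1:]
        return mu, log_s.exp()

def sleep_step(enc):
    """Forward-KL ("sleep") objective: simulate (z, x) pairs from the
    generative model, then maximize log q(z | x) on the simulated data."""
    z = torch.randn(256, 1) * prior_std
    x = z + torch.randn_like(z) * obs_std
    mu, s = enc(x)
    return -torch.distributions.Normal(mu, s).log_prob(z).mean()

def elbo_step(enc, x):
    """Reverse-KL objective: maximize the ELBO on observed x using the
    reparameterization trick (returns the negative ELBO)."""
    mu, s = enc(x)
    z = mu + s * torch.randn_like(s)
    log_q = torch.distributions.Normal(mu, s).log_prob(z)
    log_p = (torch.distributions.Normal(0.0, prior_std).log_prob(z)
             + torch.distributions.Normal(z, obs_std).log_prob(x))
    return (log_q - log_p).mean()

enc = Encoder()
opt = torch.optim.Adam(enc.parameters(), lr=1e-3)
x_obs = torch.randn(256, 1)  # stand-in for observed data
for _ in range(100):
    opt.zero_grad()
    loss = sleep_step(enc)  # swap in elbo_step(enc, x_obs) for reverse KL
    loss.backward()
    opt.step()
```

Note that the sleep step never evaluates the model likelihood of real data: it is a supervised regression on simulated catalogs, which is what makes forward-KL training attractive when the generative model is easy to sample but the posterior is hard to explore.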

Related research

Scalable Bayesian Inference for Detection and Deblending in Astronomical Images (07/12/2022)
We present a new probabilistic method for detecting, deblending, and cat...

Forward Amortized Inference for Likelihood-Free Variational Marginalization (05/29/2018)
In this paper, we introduce a new form of amortized variational inferenc...

Transport Score Climbing: Variational Inference Using Forward KL and Adaptive Neural Transport (02/03/2022)
Variational inference often minimizes the "reverse" Kullback-Leibler (KL...

Markovian Score Climbing: Variational Inference with KL(p||q) (03/23/2020)
Modern variational inference (VI) uses stochastic gradients to avoid int...

Variational Inference as Iterative Projection in a Bayesian Hilbert Space (05/14/2020)
Variational Bayesian inference is an important machine-learning tool tha...

Gradients should stay on Path: Better Estimators of the Reverse- and Forward KL Divergence for Normalizing Flows (07/17/2022)
We propose an algorithm to estimate the path-gradient of both the revers...

Variational Inference with Tail-adaptive f-Divergence (10/29/2018)
Variational inference with α-divergences has been widely used in modern ...
