Posterior-Aided Regularization for Likelihood-Free Inference

by Dongjun Kim, et al.

Recent work on likelihood-free inference aims to train a flexible density estimator for the target posterior using a set of input-output pairs drawn from simulation. Given the diversity of simulation structures, it is difficult to find a single inference method that suits every simulation model. This paper proposes a universally applicable regularization technique, called Posterior-Aided Regularization (PAR), for learning the density estimator regardless of the model structure. In particular, PAR addresses the mode collapse problem that arises as the output dimension of the simulation increases. PAR resolves this posterior mode degeneracy through a mixture of 1) the reverse KL divergence, with its mode-seeking property; and 2) the mutual information, which encourages a high-quality representation of the likelihood. Because PAR is intractable to estimate directly, we provide a unified estimation method that estimates both the reverse KL term and the mutual information term with a single neural network. We then theoretically prove that the regularized optimal solution converges asymptotically to the unregularized optimal solution as the regularization magnitude converges to zero. Additionally, we empirically show that existing sequential neural likelihood inference methods, in conjunction with PAR, achieve statistically significant gains on diverse simulation tasks.
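The mode-seeking behavior of the reverse KL divergence that PAR exploits can be illustrated with a toy example (hypothetical, not from the paper): fitting a single Gaussian to a bimodal target by grid search. Minimizing the forward KL spreads the fit across both modes, while minimizing the reverse KL locks onto one mode — the property that helps counter posterior mode degeneracy.

```python
import numpy as np

# Toy bimodal "posterior": an equal mixture of N(-3, 1) and N(3, 1).
xs = np.linspace(-10.0, 10.0, 4001)
dx = xs[1] - xs[0]

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

p = 0.5 * normal_pdf(xs, -3.0, 1.0) + 0.5 * normal_pdf(xs, 3.0, 1.0)

def kl(a, b):
    # Numerical KL(a || b) on the grid; epsilon guards against log(0).
    eps = 1e-12
    return np.sum(a * (np.log(a + eps) - np.log(b + eps))) * dx

# Fit a single Gaussian q(.; mu, sigma) under each divergence by grid search.
mus = np.linspace(-5.0, 5.0, 101)
sigmas = np.linspace(0.5, 5.0, 46)

best_fwd = min((kl(p, normal_pdf(xs, m, s)), m, s)
               for m in mus for s in sigmas)  # forward KL(p || q)
best_rev = min((kl(normal_pdf(xs, m, s), p), m, s)
               for m in mus for s in sigmas)  # reverse KL(q || p)

# Forward KL is mode-covering: it moment-matches (mu ~ 0, sigma ~ sqrt(10)).
# Reverse KL is mode-seeking: it fits one mode (mu ~ +/-3, sigma ~ 1).
print("forward KL fit: mu=%.2f sigma=%.2f" % (best_fwd[1], best_fwd[2]))
print("reverse KL fit: mu=%.2f sigma=%.2f" % (best_rev[1], best_rev[2]))
```

In PAR the reverse KL term plays this mode-seeking role against the learned density estimator, complemented by the mutual information term; this sketch only visualizes the divergence asymmetry that motivates the design.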






