Laplace Matching for fast Approximate Inference in Generalized Linear Models

05/07/2021
by Marius Hobbhahn, et al.

Bayesian inference in generalized linear models (GLMs), i.e., Gaussian regression with non-Gaussian likelihoods, is generally non-analytic and requires computationally expensive approximations such as sampling or variational inference. We propose an approximate inference framework primarily designed to be computationally cheap while still achieving high approximation quality. The concept, which we call Laplace Matching, involves closed-form, approximate, bi-directional transformations between the parameter spaces of exponential families, constructed from Laplace approximations under custom-designed basis transformations. These mappings can then be leveraged to turn a latent Gaussian distribution into a conjugate prior for a rich class of observable variables, effectively reducing inference in GLMs to conjugate inference (with small approximation errors). We empirically evaluate the method in two different GLMs, showing approximation quality comparable to state-of-the-art approximate inference techniques at a drastic reduction in computational cost. More specifically, our method has a cost comparable to the very first step of the iterative optimization usually employed in standard GLM inference.
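To make the idea concrete, below is a minimal Python sketch of one such bi-directional mapping, the Gamma distribution under the log basis z = log(x). In that basis the Gamma(alpha, beta) log-density alpha*z - beta*exp(z) + const peaks at z* = log(alpha/beta) with curvature -alpha, so its Laplace approximation is N(log(alpha/beta), 1/alpha), and the map inverts in closed form. This is a sketch of the technique rather than the paper's implementation; the Poisson round trip at the end and all variable names are illustrative additions.

```python
import numpy as np

def gamma_to_gaussian(alpha, beta):
    """Laplace approximation of Gamma(alpha, beta) in the log basis z = log(x).

    The transformed log-density alpha*z - beta*exp(z) + const has its mode at
    z* = log(alpha/beta) and second derivative -alpha there, which yields the
    Gaussian N(log(alpha/beta), 1/alpha).
    """
    return np.log(alpha / beta), 1.0 / alpha

def gaussian_to_gamma(mu, sigma2):
    """Closed-form inverse: match N(mu, sigma2) in log space to a Gamma."""
    alpha = 1.0 / sigma2
    beta = alpha * np.exp(-mu)
    return alpha, beta

# Toy round trip (illustrative numbers): a latent Gaussian belief over
# z = log(rate) becomes a Gamma prior over the rate itself ...
mu, sigma2 = 1.2, 0.05
alpha, beta = gaussian_to_gamma(mu, sigma2)

# ... which is conjugate to a Poisson likelihood, so observed counts update
# it analytically: Gamma(alpha + sum(x), beta + n) ...
counts = np.array([3, 5, 4])
alpha_post = alpha + counts.sum()
beta_post = beta + len(counts)

# ... and the posterior maps back to a Gaussian in log space in closed form,
# with no iterative optimization involved.
mu_post, sigma2_post = gamma_to_gaussian(alpha_post, beta_post)
print(f"log-space posterior: N({mu_post:.3f}, {sigma2_post:.4f})")
```

Both directions cost only a handful of scalar operations, which is where the claimed speedup over sampling or variational inference comes from; the price is the (small) error of the Laplace approximation in the transformed basis.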
