
Autoregressive Energy Machines
Neural density estimators are flexible families of parametric models which have seen widespread use in unsupervised machine learning in recent years. Maximum-likelihood training typically dictates that these models be constrained to specify an explicit density. However, this limitation can be overcome by instead using a neural network to specify an energy function, or unnormalized density, which can subsequently be normalized to obtain a valid distribution. The challenge with this approach lies in accurately estimating the normalizing constant of the high-dimensional energy function. We propose the Autoregressive Energy Machine, an energy-based model which simultaneously learns an unnormalized density and computes an importance-sampling estimate of the normalizing constant for each conditional in an autoregressive decomposition. The Autoregressive Energy Machine achieves state-of-the-art performance on a suite of density-estimation tasks.
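The core numerical idea here — an importance-sampling estimate of a normalizing constant — can be illustrated in a few lines. This is a minimal sketch on a toy one-dimensional energy (a standard Gaussian, so the true constant is known), not the paper's learned model; the proposal scale and sample count are arbitrary choices for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy energy E(x) = x^2 / 2, so exp(-E) is an unnormalized Gaussian
# whose true normalizing constant is Z = sqrt(2 * pi).
def energy(x):
    return 0.5 * x ** 2

# Proposal q: a zero-mean normal with inflated scale, so it covers the target.
scale = 2.0
x = rng.normal(0.0, scale, size=100_000)
log_q = -0.5 * (x / scale) ** 2 - np.log(scale * np.sqrt(2 * np.pi))

# Importance-sampling estimate: Z ≈ E_q[ exp(-E(x)) / q(x) ].
z_hat = np.mean(np.exp(-energy(x) - log_q))
```

With a proposal broader than the target, the importance weights have finite variance and `z_hat` concentrates around `sqrt(2 * pi) ≈ 2.507`; a proposal narrower than the target would make the estimate unreliable.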
04/11/2019 ∙ by Charlie Nash, et al.

Sequential Neural Methods for Likelihood-free Inference
Likelihood-free inference refers to inference when a likelihood function cannot be explicitly evaluated, which is often the case for models based on simulators. Most of the literature is based on sample-based `Approximate Bayesian Computation' methods, but recent work suggests that approaches based on deep neural conditional density estimators can obtain state-of-the-art results with fewer simulations. The neural approaches vary in how they choose which simulations to run and what they learn: an approximate posterior or a surrogate likelihood. This work provides some direct controlled comparisons between these choices.
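The sample-based baseline mentioned above, ABC rejection, is simple enough to sketch: draw parameters from the prior, run the simulator, and keep draws whose simulated summary statistic falls within a tolerance of the observed one. This is a toy illustration (Gaussian simulator, uniform prior, mean as summary statistic — all assumptions of this sketch), not any of the neural methods compared in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulator with an intractable likelihood (here it is tractable, but we
# pretend we can only sample): n observations from N(theta, 1).
def simulate(theta, n=50):
    return rng.normal(theta, 1.0, size=n)

observed = simulate(2.0)          # "real" data generated at theta = 2
obs_stat = observed.mean()        # summary statistic of the data

# ABC rejection: accept prior draws whose simulated summary is close to
# the observed one. The tolerance eps trades bias for acceptance rate.
eps = 0.1
prior_draws = rng.uniform(-5.0, 5.0, size=20_000)
accepted = [t for t in prior_draws
            if abs(simulate(t).mean() - obs_stat) < eps]

posterior_mean = float(np.mean(accepted))
```

The accepted draws approximate the posterior over `theta`, so `posterior_mean` lands near the generating value 2; the low acceptance rate (most of the 20,000 draws are wasted) is exactly the simulation cost that the neural density-estimation approaches aim to reduce.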
11/21/2018 ∙ by Conor Durkan, et al.

Neural Spline Flows
A normalizing flow models a complex probability density as an invertible transformation of a simple base density. Flows based on either coupling or autoregressive transforms both offer exact density evaluation and sampling, but rely on the parameterization of an easily invertible elementwise transformation, whose choice determines the flexibility of these models. Building upon recent work, we propose a fully-differentiable module based on monotonic rational-quadratic splines, which enhances the flexibility of both coupling and autoregressive transforms while retaining analytic invertibility. We demonstrate that neural spline flows improve density estimation, variational inference, and generative modeling of images.
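The structure of a coupling transform — one part of the input parameterizes an invertible elementwise map of the rest, giving a triangular Jacobian — can be sketched compactly. For brevity this sketch uses an affine elementwise map with a toy linear "conditioner" in place of a neural network; neural spline flows keep exactly this structure but replace the affine map with a monotonic rational-quadratic spline.

```python
import numpy as np

def coupling_forward(x, w, b):
    """Split x, transform x2 elementwise with parameters computed from x1."""
    x1, x2 = x[:1], x[1:]
    log_s = np.tanh(w @ x1 + b)          # toy "conditioner network"
    y2 = x2 * np.exp(log_s)              # invertible elementwise map
    log_det = float(np.sum(log_s))       # Jacobian is triangular, det is cheap
    return np.concatenate([x1, y2]), log_det

def coupling_inverse(y, w, b):
    """Exact inverse: y1 = x1 passes through, so log_s can be recomputed."""
    y1, y2 = y[:1], y[1:]
    log_s = np.tanh(w @ y1 + b)
    return np.concatenate([y1, y2 * np.exp(-log_s)])

w = np.array([[0.5], [-0.3]])
b = np.array([0.1, 0.2])
x = np.array([1.0, 2.0, -1.0])
y, log_det = coupling_forward(x, w, b)
x_back = coupling_inverse(y, w, b)       # recovers x exactly
```

Because the untransformed block passes through unchanged, the inverse needs no iteration; swapping the affine map for a monotonic spline only changes the elementwise function, not this one-pass invertibility.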
06/10/2019 ∙ by Conor Durkan, et al.

Cubic-Spline Flows
A normalizing flow models a complex probability density as an invertible transformation of a simple density. The invertibility means that we can evaluate densities and generate samples from a flow. In practice, autoregressive flow-based models are slow to invert, making either density estimation or sample generation slow. Flows based on coupling transforms are fast for both tasks, but have previously performed less well at density estimation than autoregressive flows. We stack a new coupling transform, based on monotonic cubic splines, with LU-decomposed linear layers. The resulting cubic-spline flow retains an exact one-pass inverse, can be used to generate high-quality images, and closes the gap with autoregressive flows on a suite of density-estimation tasks.
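The LU-decomposed linear layers mentioned above make an invertible linear map cheap to work with: parameterizing W = L U (unit lower-triangular L, upper-triangular U with nonzero diagonal) gives a log-determinant that is just a sum over the diagonal of U, and inversion reduces to triangular solves. A minimal sketch with randomly chosen factors (not trained parameters):

```python
import numpy as np

rng = np.random.default_rng(2)
d = 4

# Parameterize an invertible linear layer as W = L @ U:
# L is unit lower-triangular, U is upper-triangular with positive diagonal.
L = np.tril(rng.normal(size=(d, d)), -1) + np.eye(d)
U = np.triu(rng.normal(size=(d, d)), 1) + np.diag(rng.uniform(0.5, 2.0, d))
W = L @ U

# log|det W| = sum(log|diag U|): O(d) instead of a full determinant.
log_det = float(np.sum(np.log(np.abs(np.diag(U)))))

x = rng.normal(size=d)
y = W @ x
# Inversion: in practice two O(d^2) triangular solves; a general solve
# is used here only to keep the sketch short.
x_back = np.linalg.solve(W, y)
```

Constraining the diagonal of U (e.g. via an exponential) keeps W invertible by construction during training, which is the point of the parameterization.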
06/05/2019 ∙ by Conor Durkan, et al.
Conor Durkan