Neural Likelihoods via Cumulative Distribution Functions

11/02/2018
by Pawel Chilinski, et al.

We leverage neural networks as universal approximators of monotonic functions to build a parameterization of conditional cumulative distribution functions. By a modification of backpropagation, applied both to parameters and to outputs, we show that we can build black-box density estimators that are competitive with recently proposed models, while avoiding assumptions about the base distribution of a mixture model; that is, the approach makes no use of parametric models as building blocks. This removes some undesirable degrees of freedom in the design of neural networks for flexible conditional density estimation, and the implementation can be accomplished with standard algorithms readily available in popular neural network toolboxes.
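As a rough illustration of the core idea, the sketch below (not the authors' code; the architecture, layer sizes, and training step are assumptions) parameterizes a conditional CDF F(y|x) with a network that is monotonically increasing in the response y, by keeping the y-dependent weights positive, and recovers the density p(y|x) by differentiating the CDF output with respect to y via automatic differentiation, so the negative log-likelihood can be minimized with a standard optimizer:

```python
import torch
import torch.nn as nn
from torch.nn.functional import softplus


class MonotonicCDF(nn.Module):
    """Maps (x, y) to F(y|x) in (0, 1), monotonically increasing in y."""

    def __init__(self, x_dim, hidden=32):
        super().__init__()
        self.x_net = nn.Linear(x_dim, hidden)              # unconstrained conditioning path
        self.w_y = nn.Parameter(torch.randn(hidden, 1))    # raw y-weights, made positive below
        self.w_out = nn.Parameter(torch.randn(1, hidden))  # raw output weights, made positive below
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, x, y):
        # softplus keeps the y-dependent weights positive, and tanh/sigmoid are increasing,
        # so the output is a non-decreasing function of y taking values in (0, 1).
        h = torch.tanh(y @ softplus(self.w_y).t() + self.x_net(x))
        return torch.sigmoid(h @ softplus(self.w_out).t() + self.b)


def log_density(model, x, y):
    """log p(y|x) = log dF(y|x)/dy, obtained by backpropagating through the response y."""
    y = y.requires_grad_(True)
    cdf = model(x, y)
    dens = torch.autograd.grad(cdf.sum(), y, create_graph=True)[0]
    return torch.log(dens + 1e-12)


# Illustrative maximum-likelihood step on a toy batch; x has 3 features, y is scalar.
model = MonotonicCDF(x_dim=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(64, 3), torch.randn(64, 1)
loss = -log_density(model, x, y).mean()
opt.zero_grad()
loss.backward()
opt.step()
```

This toy version does not enforce that F(y|x) tends to 0 and 1 in the tails, which a complete estimator would need to handle; it only illustrates how a monotonic network plus differentiation with respect to the output variable yields a density without a parametric base distribution.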

Related research

04/30/2020: A Triangular Network For Density Estimation
In this paper, triangular networks refer to feedforward neural networks ...

05/29/2021: Deconvolutional Density Network: Free-Form Conditional Density Estimation
Conditional density estimation is the task of estimating the probability...

03/03/2019: Conditional Density Estimation with Neural Networks: Best Practices and Benchmarks
Given a set of empirical observations, conditional density estimation ai...

12/04/2020: Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models
Mixture of experts (MoE) models are widely applied for conditional proba...

08/14/2019: Unconstrained Monotonic Neural Networks
Monotonic neural networks have recently been proposed as a way to define...

02/05/2022: Beyond Black Box Densities: Parameter Learning for the Deviated Components
As we collect additional samples from a data population for which a know...
