Universal Regular Conditional Distributions

05/17/2021
by Anastasis Kratsios, et al.

We introduce a general framework for approximating regular conditional distributions (RCDs). Our approximations of these RCDs are implemented by a new class of geometric deep learning models with inputs in ℝ^d and outputs in the Wasserstein-1 space 𝒫_1(ℝ^D). We find that the models built using our framework can approximate any continuous function from ℝ^d to 𝒫_1(ℝ^D) uniformly on compacts, and quantitative rates are obtained. We identify two methods for avoiding the "curse of dimensionality"; that is, the number of parameters determining the approximating neural network depends only polynomially on the involved dimensions and on the approximation error. The first solution describes functions in C(ℝ^d,𝒫_1(ℝ^D)) which can be efficiently approximated on any compact subset of ℝ^d. Conversely, the second approach describes sets in ℝ^d on which any function in C(ℝ^d,𝒫_1(ℝ^D)) can be efficiently approximated. Our framework is used to obtain an affirmative answer to the open conjecture of Bishop (1994); namely, that mixture density networks are universal approximators of regular conditional distributions. The predictive performance of the proposed models is evaluated against comparable learning models on various probabilistic prediction tasks in the context of extreme learning machines (ELMs), model uncertainty, and heteroscedastic regression. All the results are obtained for more general input and output spaces and thus apply to geometric deep learning contexts.
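To make the objects in the abstract concrete, the sketch below illustrates the basic mechanism behind a mixture density network: a map from an input x ∈ ℝ^d to the parameters of a Gaussian mixture, i.e. to a point in 𝒫_1(ℝ), together with the 1-dimensional Wasserstein-1 distance used to compare empirical measures. This is a minimal, hypothetical illustration with randomly initialized weights (the architecture, dimensions, and helper names are our own assumptions, not the paper's construction).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny mixture density network (MDN): one hidden layer maps an
# input x in R^d to the parameters of a K-component Gaussian mixture on R,
# i.e. to a point in the Wasserstein space P_1(R).
d, hidden, K = 2, 8, 3  # input dimension, hidden width, mixture components

W1 = rng.normal(size=(hidden, d))
b1 = np.zeros(hidden)
W2 = rng.normal(size=(3 * K, hidden))  # logits, means, log-scales per component
b2 = np.zeros(3 * K)

def mdn_params(x):
    """Forward pass: returns mixture weights, means, and scales for input x."""
    h = np.tanh(W1 @ x + b1)
    out = W2 @ h + b2
    logits, means, log_scales = out[:K], out[K:2 * K], out[2 * K:]
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()        # softmax -> a probability vector
    scales = np.exp(log_scales)     # positive standard deviations
    return weights, means, scales

def sample(x, n=1000):
    """Draw n samples from the conditional distribution predicted at x."""
    w, mu, sig = mdn_params(x)
    comp = rng.choice(K, size=n, p=w)
    return rng.normal(mu[comp], sig[comp])

def wasserstein1(a, b):
    """W1 distance between two equal-size empirical measures on R:
    the mean absolute difference of the sorted samples."""
    return np.abs(np.sort(a) - np.sort(b)).mean()

x = np.array([0.5, -1.0])
w, mu, sig = mdn_params(x)
print(round(w.sum(), 6))                    # mixture weights form a probability vector
print(wasserstein1(sample(x), sample(x)))   # W1 between two draws of the same law
```

In the paper's setting the mixture parameters would be produced by a trained deep network, and universality means such maps can approximate any continuous function from ℝ^d into 𝒫_1(ℝ^D); this sketch only shows the parameterization and the metric, not the approximation theorem.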


