Learning Theory of Distribution Regression with Neural Networks

by Zhongjie Shi et al.

In this paper, we establish an approximation theory and a learning theory for distribution regression via fully connected neural networks (FNNs). In contrast to classical regression, the input variables of distribution regression are probability measures, which are typically accessible only through a second-stage sampling process that approximates each input distribution by a finite sample. Moreover, a classical neural network requires its input to be a vector, so when the inputs are probability distributions, traditional deep neural network methods cannot be applied directly; a well-defined neural network structure for distribution inputs is therefore highly desirable. To date, there has been no mathematical model or theoretical analysis of neural network realizations of distribution regression. To address this issue, we establish a novel fully connected neural network framework that realizes an approximation theory for functionals defined on the space of Borel probability measures. Building on these functional approximation results, we derive almost optimal learning rates, up to logarithmic terms, for the proposed distribution regression model in the hypothesis space induced by the novel FNN structure with distribution inputs, via a novel two-stage error decomposition technique.
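As an illustration of the two-stage setup described above (not the authors' construction), the sketch below summarizes each second-stage sample by a fixed-length feature vector of empirical moments and feeds it to a small fully connected network. The moment featurization, the toy target `y = sin(mu)`, and all network sizes are hypothetical choices made for this example only.

```python
import numpy as np

rng = np.random.default_rng(0)

def featurize(sample):
    # Second-stage sampling: each input distribution is observed only
    # through a finite sample; summarize it by a few empirical moments
    # so the FNN receives a fixed-length vector (hypothetical choice).
    return np.array([sample.mean(), sample.var(), (sample ** 3).mean()])

# Toy first-stage data: each "distribution" is N(mu, 1); target y = sin(mu).
mus = rng.uniform(-2, 2, size=200)
X = np.stack([featurize(rng.normal(mu, 1.0, size=100)) for mu in mus])
y = np.sin(mus)

# One-hidden-layer fully connected network trained by plain gradient descent.
W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.maximum(X @ W1 + b1, 0.0)          # ReLU hidden layer
    return H, (H @ W2 + b2).ravel()

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)             # squared loss before training

lr = 0.005
for _ in range(3000):
    H, pred = forward(X)
    g = 2 * (pred - y)[:, None] / len(y)      # dLoss/dpred
    gW2 = H.T @ g; gb2 = g.sum(0)
    gH = g @ W2.T; gH[H <= 0] = 0.0           # backprop through ReLU
    gW1 = X.T @ gH; gb1 = gH.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)               # squared loss after training
```

Both stages of sampling error appear here: the finite number of distributions (200) and the finite sample drawn from each one (100), which is what motivates the paper's two-stage error decomposition.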



