Learning Theory of Distribution Regression with Neural Networks

07/07/2023
by   Zhongjie Shi, et al.

In this paper, we establish an approximation theory and a learning theory for distribution regression via fully connected neural networks (FNNs). In contrast to classical regression, the input variables of distribution regression are probability measures, which are typically accessible only through a second-stage sampling process that approximates the information in each distribution. On the other hand, classical neural network architectures require vector inputs, so when the input samples are probability distributions, traditional deep neural network methods cannot be applied directly; this is the central difficulty of distribution regression. A well-defined neural network structure for distribution inputs is therefore highly desirable, yet no mathematical model or theoretical analysis of neural network realizations of distribution regression has been available. To address this issue, we establish a novel fully connected neural network framework that realizes an approximation theory for functionals defined on the space of Borel probability measures. Building on these functional approximation results, we derive almost optimal learning rates, up to logarithmic terms, for the proposed distribution regression model in the hypothesis space induced by the new FNN structure with distribution inputs, via a novel two-stage error decomposition technique.


