Approximation of Functionals by Neural Network without Curse of Dimensionality

05/28/2022
by   Yahong Yang, et al.

In this paper, we construct neural networks that approximate functionals, i.e., maps from infinite-dimensional spaces to finite-dimensional spaces. The approximation error of the network is O(1/√m), where m is the size of the network, which overcomes the curse of dimensionality. The key idea behind the approximation is to define a Barron spectral space of functionals.


Related research

- Model Reduction and Neural Networks for Parametric PDEs (05/07/2020): We develop a general framework for data-driven approximation of input-ou...
- Simplicial approximation to CW complexes in practice (12/14/2021): We describe an algorithm that takes as an input a CW complex and returns...
- On Universal Approximation by Neural Networks with Uniform Guarantees on Approximation of Infinite Dimensional Maps (10/03/2019): The study of universal approximation of arbitrary functions f: X→Y by ne...
- Barron Spaces and the Compositional Function Spaces for Neural Network Models (06/18/2019): One of the key issues in the analysis of machine learning models is to i...
- Music Signal Processing Using Vector Product Neural Networks (06/29/2017): We propose a novel neural network model for music signal processing usin...
- Universal Approximation and the Topological Neural Network (05/26/2023): A topological neural network (TNN), which takes data from a Tychonoff to...
- Improved Approximate Rips Filtrations with Shifted Integer Lattices and Cubical Complexes (05/11/2021): Rips complexes are important structures for analyzing topological featur...
