Global universal approximation of functional input maps on weighted spaces

06/05/2023
by Christa Cuchiero, et al.

We introduce so-called functional input neural networks, defined on a possibly infinite dimensional weighted space and taking values in a possibly infinite dimensional output space. To this end, we use an additive family as hidden layer maps and a non-linear activation function applied to each hidden layer. Relying on Stone-Weierstrass theorems on weighted spaces, we prove a global universal approximation result for generalizations of continuous functions, going beyond the usual approximation on compact sets. This applies in particular to the approximation of (non-anticipative) path space functionals via functional input neural networks. As a further application of the weighted Stone-Weierstrass theorem, we prove a global universal approximation result for linear functions of the signature. We also introduce the viewpoint of Gaussian process regression in this setting and show that the reproducing kernel Hilbert spaces of the signature kernels are Cameron-Martin spaces of certain Gaussian processes. This paves the way towards uncertainty quantification for signature kernel regression.
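To make the architecture concrete, here is a minimal numerical sketch of the idea, under assumptions of our own: the input "function" is a path discretized at T time points, each hidden layer map is an additive (here, discretized linear) functional of the path, and a non-linear activation is applied to each hidden layer before a linear readout. All names and the specific choices (tanh activation, Gaussian initialization) are illustrative, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def functional_input_net(path, W, b, v):
    """Hypothetical functional input network on a discretized path.

    path: (T,) samples of the input function on a grid of [0, 1]
    W:    (H, T) weights defining H additive (linear) hidden layer functionals
    b:    (H,) biases
    v:    (H,) linear readout weights
    """
    # Each row of W defines an additive functional of the path (a quadrature
    # of a linear functional); the activation acts on each hidden layer map.
    hidden = np.tanh(W @ path + b)
    return float(v @ hidden)

T, H = 50, 8
path = np.sin(np.linspace(0.0, 2.0 * np.pi, T))  # an example input path
W = rng.normal(size=(H, T)) / np.sqrt(T)
b = rng.normal(size=H)
v = rng.normal(size=H)

print(functional_input_net(path, W, b, v))
```

The universal approximation results of the paper concern dense families of such maps on weighted (non-compact) spaces; this sketch only illustrates the forward pass of a single network.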


Related research:

- 07/25/2021 · A brief note on understanding neural networks as Gaussian processes. "As a generalization of the work in [Lee et al., 2017], this note briefly..."
- 09/02/2018 · On overcoming the Curse of Dimensionality in Neural Networks. "Let H be a reproducing Kernel Hilbert space. For i=1,...,N, let x_i∈R^d ..."
- 03/15/2021 · Representation Theorem for Matrix Product States. "In this work, we investigate the universal representation capacity of th..."
- 06/02/2021 · Transformers are Deep Infinite-Dimensional Non-Mercer Binary Kernel Machines. "Despite their ubiquity in core AI fields like natural language processin..."
- 06/03/2020 · Non-Euclidean Universal Approximation. "Modifications to a neural network's input and output layers are often re..."
- 06/03/2019 · Approximation capability of neural networks on spaces of probability measures and tree-structured domains. "This paper extends the proof of density of neural networks in the space ..."
- 12/14/2016 · Deep Function Machines: Generalized Neural Networks for Topological Layer Expression. "In this paper we propose a generalization of deep neural networks called..."
