Quantitative approximation results for complex-valued neural networks

02/25/2021
by A. Caragea, et al.

We show that complex-valued neural networks with the modReLU activation function σ(z) = ReLU(|z| - 1) · z / |z| can uniformly approximate complex-valued functions of regularity C^n on compact subsets of ℂ^d, giving explicit bounds on the approximation rate.
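For concreteness, here is a minimal NumPy sketch of the modReLU activation stated in the abstract. The function name, the bias parameter b (fixed to -1 in the paper's statement), and the division guard at z = 0 are illustrative choices, not taken from the paper.

```python
import numpy as np

def modrelu(z: np.ndarray, b: float = -1.0) -> np.ndarray:
    """modReLU: sigma(z) = ReLU(|z| + b) * z / |z|, with sigma(0) = 0.

    With the paper's bias b = -1, inputs of modulus at most 1 are zeroed
    out, while larger inputs keep their phase and have their modulus
    reduced by 1.
    """
    r = np.abs(z)
    # Guard the division at z = 0; ReLU(|0| - 1) = 0 there anyway.
    scale = np.maximum(r + b, 0.0) / np.maximum(r, np.finfo(float).tiny)
    return scale * z

# Example: 0.5 + 0.5i (modulus <= 1) is annihilated; 2i maps to i; -3 maps to -2.
print(modrelu(np.array([0.5 + 0.5j, 2.0j, -3.0 + 0.0j])))
# [ 0.+0.j  0.+1.j -2.+0.j]
```

Note that the activation acts only on the modulus and leaves the phase z / |z| untouched, which is what makes it a natural complex-valued analogue of ReLU.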


Related research

03/29/2023 · Optimal approximation of C^k-functions using shallow complex-valued neural networks
We prove a quantitative result for the approximation of functions of reg...

04/18/2021 · On the approximation of functions by tanh neural networks
We derive bounds on the error, in high-order Sobolev norms, incurred in ...

05/05/2021 · Two-layer neural networks with values in a Banach space
We study two-layer neural networks whose domain and range are Banach spa...

08/31/2022 · Interpolation of Set-Valued Functions
Given a finite number of samples of a continuous set-valued function F, ...

03/25/2020 · Bayesian Sparsification Methods for Deep Complex-valued Networks
With continual miniaturization ever more applications of deep learning c...

04/10/2019 · CNM: An Interpretable Complex-valued Network for Matching
This paper seeks to model human language by the mathematical framework o...

12/06/2021 · Associative Memories Using Complex-Valued Hopfield Networks Based on Spin-Torque Oscillator Arrays
Simulations of complex-valued Hopfield networks based on spin-torque osc...
