Quantitative approximation results for complex-valued neural networks

02/25/2021
by A. Caragea et al.

We show that complex-valued neural networks with the modReLU activation function σ(z) = ReLU(|z| - 1) · z / |z| can uniformly approximate complex-valued functions of regularity C^n on compact subsets of ℂ^d, giving explicit bounds on the approximation rate.
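The modReLU activation from the abstract can be sketched in NumPy. This is an illustrative helper, not code from the paper; the parameter `b` generalizes the fixed offset, with `b = -1` recovering σ(z) = ReLU(|z| - 1) · z / |z|.

```python
import numpy as np

def modrelu(z, b=-1.0):
    """modReLU: ReLU(|z| + b) * z / |z| for complex z.

    With b = -1 this matches the activation in the abstract.
    The output is zero wherever |z| + b <= 0 (including z = 0,
    which also avoids division by zero in z / |z|).
    Hypothetical sketch, not the authors' implementation.
    """
    z = np.asarray(z, dtype=complex)
    mag = np.abs(z)
    out = np.zeros_like(z)
    # Only divide by |z| where the activation is nonzero and |z| > 0.
    mask = (mag + b > 0) & (mag > 0)
    out[mask] = (mag[mask] + b) * z[mask] / mag[mask]
    return out
```

For example, with the default b = -1, an input of modulus 2 is shrunk to modulus 1 while its phase is preserved, and any input inside the unit disk is mapped to 0.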
