Universal approximation properties of shallow quadratic neural networks

10/04/2021
by   Leon Frischauf, et al.
In this paper we propose a new class of neural network functions which are linear combinations of compositions of activation functions with quadratic functions, replacing the standard affine linear functions commonly used inside neurons. We show the universality of this approximation and prove convergence rate results based on the theory of wavelets and statistical learning. We investigate the efficiency of our new approach numerically on simple test cases, showing that it requires fewer neurons than standard affine linear neural networks. Similar observations are made when comparing deep (multi-layer) networks.
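The architecture described above replaces each neuron's affine map with a quadratic one, so a shallow network takes the form Σₖ αₖ σ(qₖ(x)) with each qₖ quadratic. Below is a minimal sketch of this idea, assuming a generic quadratic qₖ(x) = xᵀWₖx + bₖᵀx + cₖ and a sigmoid activation; the class name, random initialization, and parameter shapes are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ShallowQuadraticNet:
    """Sketch of f(x) = sum_k alpha_k * sigma(q_k(x)),
    where q_k(x) = x^T W_k x + b_k^T x + c_k is a quadratic
    function replacing the usual affine map w_k^T x + c_k.
    (Hypothetical parametrization for illustration.)"""

    def __init__(self, dim, n_neurons):
        self.W = rng.normal(size=(n_neurons, dim, dim))   # quadratic terms
        self.b = rng.normal(size=(n_neurons, dim))        # linear terms
        self.c = rng.normal(size=n_neurons)               # offsets
        self.alpha = rng.normal(size=n_neurons)           # output weights

    def forward(self, x):
        # q[k] = x^T W[k] x + b[k]^T x + c[k], evaluated for all neurons at once
        q = np.einsum('i,kij,j->k', x, self.W, x) + self.b @ x + self.c
        return self.alpha @ sigmoid(q)

net = ShallowQuadraticNet(dim=2, n_neurons=5)
y = net.forward(np.array([0.3, -0.7]))
```

Compared with an affine neuron, each quadratic neuron carries O(d²) extra parameters per unit, which is the trade-off behind the paper's observation that fewer neurons suffice.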


