A global universality of two-layer neural networks with ReLU activations

11/20/2020
by Naoya Hatano, et al.

In the present study, we investigate the universality of neural networks, which concerns the density of the set of two-layer neural networks in a function space. Many existing works handle convergence over compact sets. In the present paper, we consider global convergence by introducing a suitable norm, so that our results are uniform over every compact set.
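
The abstract does not spell out the network class or the norm the authors introduce. As a minimal sketch of the objects involved (the weighted norm below is an illustrative assumption, not necessarily the authors' choice), a two-layer ReLU network and the two notions of universality can be written as:

% Two-layer (one-hidden-layer) ReLU network with n hidden units:
\[
  f(x) = \sum_{i=1}^{n} a_i \, \mathrm{ReLU}(w_i \cdot x + b_i),
  \qquad \mathrm{ReLU}(t) = \max\{t, 0\}.
\]
% Classical (compact) universality: for every continuous g and every
% compact set K, the networks are dense in the sup-norm on K:
\[
  \inf_{f} \sup_{x \in K} \, |f(x) - g(x)| = 0.
\]
% A global result instead asks for density in a single norm on the
% whole domain; one illustrative (assumed, not the paper's) choice is
% a weighted sup-norm:
\[
  \|g\|_{w} = \sup_{x \in \mathbb{R}^d} \frac{|g(x)|}{1 + |x|}.
\]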

Related research

03/01/2021: Computation complexity of deep ReLU neural networks in high-dimensional approximation
The purpose of the present paper is to study the computation complexity ...

09/28/2021: Convergence of Deep Convolutional Neural Networks
Convergence of deep neural networks as the depth of the networks tends t...

07/30/2020: On the Banach spaces associated with multi-layer ReLU networks: Function representation, approximation theory and gradient descent dynamics
We develop Banach spaces for ReLU neural networks of finite depth L and ...

02/06/2020: Global Convergence of Frank Wolfe on One Hidden Layer Networks
We derive global convergence bounds for the Frank Wolfe algorithm when t...

06/20/2018: Learning ReLU Networks via Alternating Minimization
We propose and analyze a new family of algorithms for training neural ne...

06/10/2020: Representation formulas and pointwise properties for Barron functions
We study the natural function space for infinitely wide two-layer neural...

05/05/2021: Two-layer neural networks with values in a Banach space
We study two-layer neural networks whose domain and range are Banach spa...
