Deep Neural Networks for Rotation-Invariance Approximation and Learning

04/03/2019
by Charles K. Chui, et al.

Based on a tree architecture, the objective of this paper is to design deep neural networks with two or more hidden layers (called deep nets) that realize radial functions, so as to enable rotational invariance for near-optimal function approximation in arbitrarily high-dimensional Euclidean spaces. It is shown that deep nets perform much better than shallow nets (with only one hidden layer) in terms of approximation accuracy and learning capability. In particular, for learning radial functions, a near-optimal rate can be achieved by deep nets but not by shallow nets. Our results illustrate the necessity of depth in neural network design for the realization of rotation-invariant target functions.
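The radial (rotation-invariant) target functions studied in the paper have the form f(x) = g(||x||), so their value is unchanged under any rotation of the input. The following minimal sketch illustrates that property numerically; the choice of g and the dimension are arbitrary illustrations, not taken from the paper.

```python
import numpy as np

# A radial target function f(x) = g(||x||); this particular g is an
# arbitrary illustrative choice, not one used in the paper.
def g(r):
    return np.exp(-r**2)

def f(x):
    return g(np.linalg.norm(x))

rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal(d)

# A random rotation: the orthogonal factor of a QR decomposition.
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))

# Rotation invariance: since ||Qx|| = ||x||, f(Qx) equals f(x)
# up to floating-point error.
print(abs(f(Q @ x) - f(x)) < 1e-12)  # True
```

This invariance is what the deep-net constructions exploit: a first stage of the network can realize the radial part x ↦ ||x||, after which only a one-dimensional function g remains to be approximated.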

Related research

- Approximation smooth and sparse functions by deep neural networks without saturation (01/13/2020)
- Construction of neural networks for realization of localized deep learning (03/09/2018)
- Realizing data features by deep nets (01/01/2019)
- Realization of spatial sparseness by deep ReLU nets with massive data (12/16/2019)
- Generalization and Expressivity for Deep Nets (03/10/2018)
- Depth separation beyond radial functions (02/02/2021)
- ReLU nets adapt to intrinsic dimensionality beyond the target domain (08/06/2020)
