Any Target Function Exists in a Neighborhood of Any Sufficiently Wide Random Network: A Geometrical Perspective

01/20/2020
by Shun-ichi Amari, et al.

It is known that any target function can be realized in a sufficiently small neighborhood of any randomly connected deep network, provided the width (the number of neurons in a layer) is sufficiently large. Sophisticated theories have been developed around this striking fact, but rigorous treatments are very complicated. We give an elementary geometrical proof using a simple model, in order to elucidate the structure of the phenomenon. We show that high-dimensional geometry plays a magical role: when we project a high-dimensional sphere of radius 1 onto a low-dimensional subspace, the uniform distribution over the sphere reduces to a Gaussian distribution with negligibly small covariances.
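As a quick numerical sanity check of the geometric fact stated above, the following sketch (ours, not from the paper; the dimension n, projection dimension k, and sample size m are arbitrary illustrative choices) samples points uniformly on the unit sphere S^{n-1}, projects them onto a fixed k-dimensional coordinate subspace, and compares the empirical covariance of the projection with the predicted (1/n) I_k.

```python
import numpy as np

# Illustrative sketch (not from the paper): uniform points on S^{n-1},
# projected to a low-dimensional coordinate subspace, look approximately
# Gaussian with covariance (1/n) * I_k, i.e. negligibly small covariances.

rng = np.random.default_rng(0)
n, k, m = 1000, 3, 10_000  # ambient dimension, projection dimension, samples

# Uniform points on the unit sphere: normalize standard Gaussian vectors.
x = rng.standard_normal((m, n))
x /= np.linalg.norm(x, axis=1, keepdims=True)

# Project onto the first k coordinates (any fixed k-dim subspace behaves alike).
proj = x[:, :k]

emp_cov = np.cov(proj, rowvar=False)
print("empirical covariance of the projection:\n", emp_cov)
print("predicted covariance (1/n) * I_k:\n", np.eye(k) / n)
# The off-diagonal entries are ~0 and the diagonal entries are ~1/n = 1e-3,
# matching the claim that the projected uniform distribution is close to a
# Gaussian with negligibly small covariances.
```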


