
Any Target Function Exists in a Neighborhood of Any Sufficiently Wide Random Network: A Geometrical Perspective

by Shun-ichi Amari et al.

It is known that any target function is realized in a sufficiently small neighborhood of any randomly connected deep network, provided the width (the number of neurons in a layer) is sufficiently large. There are sophisticated theories concerning this striking fact, but rigorous treatments are very complicated. We give an elementary geometrical proof, using a simple model, to elucidate its structure. We show that high-dimensional geometry plays a magical role: when a high-dimensional sphere of radius 1 is projected to a low-dimensional subspace, the uniform distribution over the sphere reduces to a Gaussian distribution with negligibly small covariance.
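The projection phenomenon the abstract describes can be checked numerically. The sketch below (not from the paper; the dimensions and sample size are illustrative choices) samples points uniformly on the unit sphere in R^n and projects them onto the first k coordinates; the empirical covariance of the projection comes out close to (1/n) times the identity, i.e. nearly independent Gaussians with tiny variance.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 1_000, 3, 10_000  # ambient dimension, projection dimension, sample count

# Uniform samples on the unit sphere S^{n-1}: normalize standard Gaussian vectors.
x = rng.standard_normal((m, n))
x /= np.linalg.norm(x, axis=1, keepdims=True)

# Orthogonal projection onto the first k coordinate axes.
proj = x[:, :k]

# Empirical covariance of the projected points (mean is ~0 by symmetry).
cov = proj.T @ proj / m

# Scaled by n, the covariance is approximately the identity:
# each coordinate has variance ~1/n and the cross-covariances are negligible.
print(np.round(cov * n, 2))
```

Running this prints a matrix close to the 3x3 identity, matching the claim that the projected distribution is Gaussian with covariance on the order of 1/n.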



