Expressivity of expand-and-sparsify representations

06/05/2020
by Sanjoy Dasgupta, et al.

A simple sparse coding mechanism appears in the sensory systems of several organisms: to a coarse approximation, an input x ∈ ℝ^d is mapped to a much higher dimension m ≫ d by a random linear transformation, and is then sparsified by a winner-take-all process in which only the positions of the top k values are retained, yielding a k-sparse vector z ∈ {0,1}^m. We study the benefits of this representation for subsequent learning. We first show a universal approximation property: arbitrary continuous functions of x are well approximated by linear functions of z, provided m is large enough. This can be interpreted as saying that z unpacks the information in x and makes it more readily accessible. The linear functions can be specified explicitly and are easy to learn, and we give bounds on how large m needs to be as a function of the input dimension d and the smoothness of the target function. Next, we consider whether the representation is adaptive to manifold structure in the input space. This depends strongly on the specific method of sparsification: we show that adaptivity is not obtained under the winner-take-all mechanism, but does hold under a slight variant. Finally, we consider mappings to the representation space that are random but attuned to the data distribution, and we give favorable approximation bounds in this setting.
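
As a concrete illustration of the mechanism, here is a minimal numpy sketch of the expand-and-sparsify map together with a least-squares linear readout on z. The Gaussian choice of W, the toy target f, and the sizes d, m, k are illustrative assumptions, not the paper's exact construction or bounds.

```python
import numpy as np

def expand_and_sparsify(x, W, k):
    """Map x in R^d to a k-sparse binary z in {0,1}^m: a random linear
    expansion by W, then winner-take-all keeping the top-k positions."""
    y = W @ x                              # expand: random projection to R^m
    z = np.zeros(W.shape[0])
    z[np.argpartition(y, -k)[-k:]] = 1.0   # sparsify: indicator of the top k values
    return z

rng = np.random.default_rng(0)
d, m, k = 5, 2000, 40                      # m >> d; all sizes are illustrative
W = rng.standard_normal((m, d))            # Gaussian projection (an assumption)

f = lambda x: np.sin(x.sum())              # an arbitrary smooth target (toy choice)

def encode(X):
    return np.stack([expand_and_sparsify(x, W, k) for x in X])

X_train = rng.uniform(-1.0, 1.0, size=(2000, d))
X_test = rng.uniform(-1.0, 1.0, size=(500, d))
y_train = np.array([f(x) for x in X_train])
y_test = np.array([f(x) for x in X_test])

# Fit a linear function of z by least squares, then check how well it
# approximates f on fresh inputs.
w, *_ = np.linalg.lstsq(encode(X_train), y_train, rcond=None)
print("mean |error| on held-out x:", np.abs(encode(X_test) @ w - y_test).mean())
```

On this toy setup, increasing m (with k scaled accordingly) should improve the held-out error, in line with the abstract's claim that linear functions of z approximate continuous functions of x once m is large enough.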
