
Besov Function Approximation and Binary Classification on Low-Dimensional Manifolds Using Convolutional Residual Networks
Most existing statistical theories on deep neural networks have sample complexities cursed by the data dimension and therefore cannot fully explain the empirical success of deep learning on high-dimensional data. To bridge this gap, we propose to exploit low-dimensional geometric structures of real-world data sets. We establish theoretical guarantees for convolutional residual networks (ConvResNets) in terms of function approximation and statistical estimation for binary classification. Specifically, given data lying on a d-dimensional manifold isometrically embedded in ℝ^D, we prove that if the network architecture is properly chosen, ConvResNets can (1) approximate Besov functions on manifolds to arbitrary accuracy, and (2) learn a classifier by minimizing the empirical logistic risk, achieving an excess risk of order n^{-s/(2s+2(s∨d))}, where s is a smoothness parameter. This implies that the sample complexity depends on the intrinsic dimension d, instead of the data dimension D. Our results demonstrate that ConvResNets are adaptive to low-dimensional structures of data sets.
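The excess-risk rate n^{-s/(2s+2(s∨d))} stated above involves only the smoothness s and the intrinsic dimension d, never the ambient dimension D. A minimal sketch of that exponent (the function name and the sample values are illustrative, not from the paper):

```python
def excess_risk_exponent(s: float, d: float) -> float:
    """Exponent in the excess-risk rate n**exponent from the abstract:
    -s / (2s + 2*(s v d)), where s v d = max(s, d).
    Note it depends on the intrinsic dimension d, not the ambient D."""
    return -s / (2 * s + 2 * max(s, d))

# Example: smoothness s = 2 on a d = 4 manifold embedded in, say, R^1000.
# The ambient dimension (1000) never enters the rate.
print(excess_risk_exponent(2, 4))   # -2/12 = -0.1666...

# A larger intrinsic dimension slows the rate (exponent closer to 0):
print(excess_risk_exponent(2, 10))  # -2/24
```

As the second call illustrates, the rate degrades with the intrinsic dimension d alone, which is the sense in which the sample complexity escapes the curse of the ambient dimension D.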