
Intrinsic Dimension Estimation

06/08/2021
by Adam Block, et al.

It has long been thought that the high-dimensional data encountered in many practical machine learning tasks have low-dimensional structure, i.e., that the manifold hypothesis holds. A natural question is thus how to estimate the intrinsic dimension of a given population distribution from a finite sample. We introduce a new estimator of the intrinsic dimension and provide finite-sample, non-asymptotic guarantees. We then apply our techniques to obtain new sample complexity bounds for Generative Adversarial Networks (GANs) that depend only on the intrinsic dimension of the data.
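To make the estimation task concrete, the sketch below implements a classic nearest-neighbor maximum-likelihood intrinsic dimension estimator in the style of Levina and Bickel (2004). It is offered only as an illustration of estimating intrinsic dimension from a finite sample; it is not the estimator proposed in this paper, and the function name `mle_intrinsic_dimension`, the neighborhood size `k`, and the synthetic example are illustrative choices assuming numpy and scikit-learn are available.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def mle_intrinsic_dimension(X: np.ndarray, k: int = 10) -> float:
    """MLE intrinsic dimension estimate for samples X of shape (n, ambient_dim).

    Classic Levina-Bickel style estimator, NOT the estimator from this paper.
    """
    # Distances from each point to its k nearest neighbors; the closest
    # returned neighbor is the point itself at distance zero, so ask for k+1.
    nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dists, _ = nbrs.kneighbors(X)
    dists = dists[:, 1:]  # shape (n, k), sorted ascending per row

    # Per-point inverse-dimension estimate: mean of log(T_k / T_j) over j < k,
    # where T_j is the distance to the j-th nearest neighbor.
    inv_dim = np.log(dists[:, -1:] / dists[:, :-1]).mean(axis=1)

    # Average the inverse estimates, then invert (MacKay-Ghahramani variant).
    return 1.0 / inv_dim.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Data on a 2-dimensional linear manifold embedded in 10 ambient dimensions.
    latent = rng.normal(size=(2000, 2))
    X = latent @ rng.normal(size=(2, 10))
    print(f"estimated intrinsic dimension: {mle_intrinsic_dimension(X):.2f}")
```

On samples drawn from a d-dimensional manifold, such an estimate concentrates near d for a suitable neighborhood size; the choice of k trades bias against variance, which is precisely the kind of finite-sample behavior the paper's non-asymptotic guarantees address.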


Related Research

01/11/2019 · Non-Parametric Inference Adaptive to Intrinsic Dimension
We consider non-parametric estimation and inference of conditional momen...

07/06/2022 · The Union of Manifolds Hypothesis and its Implications for Deep Generative Modelling
Deep learning has had tremendous success at learning low-dimensional rep...

03/15/2020 · Hierarchical Models: Intrinsic Separability in High Dimensions
It has long been noticed that high dimension data exhibits strange patte...

04/18/2021 · The Intrinsic Dimension of Images and Its Impact on Learning
It is widely believed that natural image data exhibits low-dimensional s...

06/01/2019 · Graph-based Discriminators: Sample Complexity and Expressiveness
A basic question in learning theory is to identify if two distributions ...

10/30/2020 · Empirical or Invariant Risk Minimization? A Sample Complexity Perspective
Recently, invariant risk minimization (IRM) was proposed as a promising ...

05/04/2022 · A Manifold Two-Sample Test Study: Integral Probability Metric with Neural Networks
Two-sample tests are important areas aiming to determine whether two col...