Intrinsic Dimension Estimation

06/08/2021
by Adam Block et al.

It has long been thought that high-dimensional data encountered in many practical machine learning tasks have low-dimensional structure, i.e., that the manifold hypothesis holds. A natural question, then, is how to estimate the intrinsic dimension of a given population distribution from a finite sample. We introduce a new estimator of the intrinsic dimension and provide finite-sample, non-asymptotic guarantees. We then apply our techniques to obtain new sample complexity bounds for Generative Adversarial Networks (GANs) that depend only on the intrinsic dimension of the data.
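To make the estimation problem concrete, the sketch below implements a classic nearest-neighbor intrinsic dimension estimator, the Levina-Bickel maximum-likelihood estimator (with the MacKay-Ghahramani averaging correction). This is not the estimator proposed in the paper above; it is a standard baseline shown purely for illustration, and the function name and parameters are our own.

```python
import numpy as np

def mle_intrinsic_dimension(X, k=10):
    """Levina-Bickel MLE estimate of intrinsic dimension.

    X: (n, D) array of n samples in ambient dimension D.
    k: number of nearest neighbors used per point.
    (Illustrative baseline, not the estimator from the paper.)
    """
    # Pairwise Euclidean distances; O(n^2 D), fine for small samples.
    diffs = X[:, None, :] - X[None, :, :]
    dists = np.sqrt(np.sum(diffs ** 2, axis=-1))
    # Sort each row; column 0 is the point itself (distance 0), so
    # columns 1..k hold the distances T_1(x), ..., T_k(x) to the k-NN.
    knn = np.sort(dists, axis=1)[:, 1:k + 1]
    # Per-point inverse dimension: mean over j < k of log(T_k / T_j).
    logs = np.log(knn[:, -1][:, None] / knn[:, :-1])
    inv_dim = logs.mean(axis=1)
    # Average the inverse estimates (MacKay-Ghahramani correction).
    return 1.0 / inv_dim.mean()
```

On data drawn from a 2-dimensional subspace embedded in a 10-dimensional ambient space, the estimate should come out near 2 rather than 10, which is the behavior an intrinsic dimension estimator is expected to exhibit.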


Related research

01/11/2019: Non-Parametric Inference Adaptive to Intrinsic Dimension
We consider non-parametric estimation and inference of conditional momen...

02/25/2023: On Deep Generative Models for Approximation and Estimation of Distributions on Manifolds
Generative networks have experienced great empirical successes in distri...

07/06/2022: The Union of Manifolds Hypothesis and its Implications for Deep Generative Modelling
Deep learning has had tremendous success at learning low-dimensional rep...

03/15/2020: Hierarchical Models: Intrinsic Separability in High Dimensions
It has long been noticed that high dimension data exhibits strange patte...

04/18/2021: The Intrinsic Dimension of Images and Its Impact on Learning
It is widely believed that natural image data exhibits low-dimensional s...

06/01/2019: Graph-based Discriminators: Sample Complexity and Expressiveness
A basic question in learning theory is to identify if two distributions ...

10/30/2020: Empirical or Invariant Risk Minimization? A Sample Complexity Perspective
Recently, invariant risk minimization (IRM) was proposed as a promising ...
