
Hierarchical Models: Intrinsic Separability in High Dimensions

03/15/2020
by   Wen-Yan Lin, et al.
Singapore Management University

It has long been noticed that high-dimensional data exhibits strange patterns. These have been variously interpreted as either a "blessing" or a "curse", causing uncomfortable inconsistencies in the literature. We propose that these patterns arise from an intrinsically hierarchical generative process. Modeling this process creates a web of constraints that reconciles many different theories and results. The model also implies that high-dimensional data possesses an innate separability which can be exploited for machine learning. We demonstrate how this permits the open-set learning problem to be defined mathematically, leading to qualitative and quantitative improvements in performance.
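The separability the abstract refers to can be illustrated with a toy simulation. The snippet below is an illustrative sketch (not the paper's actual model): it draws two class means from a parent distribution and scatters points around each mean, mimicking a two-level hierarchical generative process. In high dimensions, distances concentrate, so every point ends up closer to its own class mean than to the other class's mean, i.e. the classes become almost trivially separable. All parameter names and values here are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1000                     # ambient dimension (high)
n = 200                      # points per class
sigma_between = 1.0          # spread of class means (top level)
sigma_within = 1.0           # spread of points around their mean (bottom level)

# Two-level hierarchical generation: class means are themselves random draws,
# then observations scatter isotropically around their class mean.
means = rng.normal(0.0, sigma_between, size=(2, d))
X0 = means[0] + rng.normal(0.0, sigma_within, size=(n, d))

# Distance of each class-0 point to its own mean vs. the other class's mean.
d_own = np.linalg.norm(X0 - means[0], axis=1)
d_other = np.linalg.norm(X0 - means[1], axis=1)

# Concentration of measure: d_own clusters near sqrt(d)*sigma_within,
# while d_other picks up the extra mean-to-mean separation, so in high
# dimensions the two distance distributions barely overlap.
print((d_own < d_other).mean())
```

With these settings the fraction printed is essentially 1.0: the within-class distances concentrate around √d·σ_within ≈ 31.6 while the cross-class distances concentrate near √(3d) ≈ 54.8, so a nearest-mean rule separates the classes almost perfectly. This is the high-dimensional regularity the abstract argues can be exploited rather than suffered.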
