
Hierarchical Models: Intrinsic Separability in High Dimensions

by Wen-Yan Lin, et al.
Singapore Management University

It has long been noticed that high-dimensional data exhibits strange patterns. These have been variously interpreted as either a "blessing" or a "curse", causing uncomfortable inconsistencies in the literature. We propose that these patterns arise from an intrinsically hierarchical generative process. Modeling the process creates a web of constraints that reconciles many different theories and results. The model also implies that high-dimensional data possesses an innate separability that can be exploited for machine learning. We demonstrate how this permits the open-set learning problem to be defined mathematically, leading to qualitative and quantitative improvements in performance.
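The separability claim can be illustrated with a small simulation. The sketch below assumes a simple two-level hierarchical generative process (class means drawn from a parent Gaussian, samples drawn around each mean); it is not the paper's exact model, and the parameters `d`, `n`, and `sigma` are arbitrary choices for illustration. As the dimension grows, pairwise distances concentrate, so every within-class distance falls below every between-class distance:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, sigma = 1000, 50, 0.5  # illustrative dimension, samples/class, noise scale

# Two-level hierarchical generative process (illustrative assumption):
# class means drawn from a parent Gaussian, samples around each mean.
mu_a = rng.normal(size=d)
mu_b = rng.normal(size=d)
A = mu_a + sigma * rng.normal(size=(n, d))
B = mu_b + sigma * rng.normal(size=(n, d))

def pairwise(X, Y):
    """All Euclidean distances between rows of X and rows of Y."""
    return np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)

# Largest within-class distance vs. smallest between-class distance.
max_intra = pairwise(A, A)[np.triu_indices(n, k=1)].max()
min_inter = pairwise(A, B).min()

print(max_intra < min_inter)
```

In high dimension the gap is large (here the expected squared between-class distance is roughly five times the within-class one), so even a nearest-neighbour rule separates the classes perfectly; in low dimension the same process produces heavy overlap.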

