SRP: Efficient class-aware embedding learning for large-scale data via supervised random projections

11/07/2018
by   Amir-Hossein Karimi, et al.

Supervised dimensionality reduction strategies have been of great interest. However, current supervised approaches are difficult to scale to large datasets given their high computational complexity. While stochastic approximation strategies have been explored to tackle this challenge for unsupervised dimensionality reduction, such approaches are not well-suited to accelerating supervised dimensionality reduction. Motivated by this challenge, in this study we explore a novel direction: directly learning optimal class-aware embeddings in a supervised manner via the notion of supervised random projections (SRP). The key idea behind SRP is that, rather than performing spectral decomposition (or approximations thereof), which is computationally prohibitive for large-scale data, we instead perform a direct decomposition by leveraging kernel approximation theory and the symmetry of the Hilbert-Schmidt Independence Criterion (HSIC), a measure of dependence between the embedded data and the labels. Experimental results on five different synthetic and real-world datasets demonstrate that the proposed SRP strategy produces embeddings that are highly competitive with existing supervised dimensionality reduction methods (e.g., SPCA and KSPCA) while achieving 1-2 orders of magnitude better computational performance. As such, this efficient approach to learning embeddings can be a powerful tool for large-scale data analysis and visualization.
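The direct decomposition described above can be illustrated with a minimal sketch. The idea, under the stated assumptions, is that SPCA-style methods find projection directions by eigendecomposing a matrix of the form Q = X H K_y H X^T (with H a centering matrix and K_y a kernel over labels); if K_y is approximated by a random feature map, K_y ≈ Ψ Ψ^T, the projection U = X^T H Ψ can be formed directly without any spectral decomposition. The helper below (`srp_embed` is a hypothetical name, and the use of random Fourier features over labels is one possible instantiation, not necessarily the authors' exact construction):

```python
import numpy as np

def srp_embed(X, y, d=2, rng=None):
    """Hedged sketch of supervised random projections (SRP).

    Instead of eigendecomposing Q = X^T H K_y H X as in SPCA,
    approximate the label kernel K_y ~= Psi Psi^T with a random
    feature map Psi and take U = X^T H Psi directly as the
    projection matrix (one column per embedding dimension).
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    y = np.asarray(y, dtype=float).reshape(n, -1)  # labels as features
    # Random Fourier features approximating an RBF kernel on the labels
    W = rng.normal(size=(y.shape[1], d))           # random frequencies
    b = rng.uniform(0.0, 2.0 * np.pi, size=d)      # random phases
    Psi = np.sqrt(2.0 / d) * np.cos(y @ W + b)     # n x d feature map
    U = X.T @ H @ Psi                              # p x d projection
    U /= np.linalg.norm(U, axis=0, keepdims=True)  # unit-norm directions
    return X @ U                                   # n x d embedding
```

The cost is dominated by the matrix products, which scale linearly in the number of samples for a fixed embedding dimension, which is consistent with the speedups over eigendecomposition-based SPCA/KSPCA reported in the abstract.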


