Label Embedding by Johnson-Lindenstrauss Matrices

05/31/2023
by Jianxin Zhang, et al.

We present a simple and scalable framework for extreme multiclass classification based on Johnson-Lindenstrauss matrices (JLMs). Using the columns of a JLM to embed the labels, a C-class classification problem is transformed into a regression problem with O(log C) output dimension. We derive an excess risk bound, revealing a tradeoff between computational efficiency and prediction accuracy, and further show that under the Massart noise condition, the penalty for dimension reduction vanishes. Our approach is easily parallelizable, and experimental results demonstrate its effectiveness and scalability in large-scale applications.
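A minimal sketch of the label-embedding idea in the abstract: embed each of C classes as a column of a random Gaussian JL matrix, regress inputs onto the embedded labels, and decode a prediction to the nearest label column. All dimensions, the toy data, and the least-squares regressor are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

C = 1000          # number of classes
d = 64            # embedding dimension, on the order of log C
n, p = 500, 20    # samples, input features

# Johnson-Lindenstrauss matrix: column j is the embedding of class j.
# Scaling by 1/sqrt(d) makes columns approximately unit-norm and
# near-orthogonal with high probability.
G = rng.normal(size=(d, C)) / np.sqrt(d)

# Toy data (purely illustrative).
X = rng.normal(size=(n, p))
y = rng.integers(0, C, size=n)

# Regression targets are the embedded labels.
Y = G[:, y].T                       # shape (n, d)

# Any multi-output regressor works; plain least squares as a stand-in.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Decode: predict an embedding, then pick the label whose column
# has the largest inner product with it.
Z = X @ W                           # predicted embeddings, shape (n, d)
y_hat = np.argmax(Z @ G, axis=1)    # decoded class labels, shape (n,)
```

The regression thus has only d ≈ O(log C) outputs rather than C, which is the source of the computational savings the abstract describes; decoding against all C columns can also be parallelized or approximated with nearest-neighbor search.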

