Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction

06/15/2020
by Yaodong Yu, et al.

To learn intrinsic low-dimensional structures from high-dimensional data that most discriminate between classes, we propose the principle of Maximal Coding Rate Reduction (MCR^2), an information-theoretic objective that maximizes the difference between the coding rate of the whole dataset and the sum of the coding rates of the individual classes. We clarify its relationships with most existing frameworks, such as cross-entropy, information bottleneck, information gain, contractive learning, and contrastive learning, and provide theoretical guarantees for learning diverse and discriminative features. The coding rate can be accurately computed from finite samples of degenerate subspace-like distributions, and the principle applies to learning intrinsic representations in supervised, self-supervised, and unsupervised settings in a unified manner. Empirically, the representations learned using this principle alone are significantly more robust to label corruption in classification than those learned with cross-entropy, and can lead to state-of-the-art results in clustering mixed data from self-learned invariant features.

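For concreteness, the sketch below illustrates the quantity the abstract refers to: the coding rate reduction, i.e. the coding rate of the whole feature set minus the sample-weighted sum of the coding rates of the individual classes. This is a minimal NumPy sketch under assumed conventions (column-wise features normalized to the unit sphere, a fixed distortion parameter eps of 0.5); the names coding_rate and coding_rate_reduction and the toy data are illustrative, not the authors' released implementation.

```python
# A minimal NumPy sketch of the MCR^2 objective, assuming features are stored
# column-wise in a d x n matrix and normalized to the unit sphere. Function
# names and the eps value are illustrative, not the authors' reference code.
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z, eps) = 1/2 * logdet(I + d/(n * eps^2) * Z @ Z.T) for Z of shape (d, n)."""
    d, n = Z.shape
    alpha = d / (n * eps ** 2)
    return 0.5 * np.linalg.slogdet(np.eye(d) + alpha * Z @ Z.T)[1]

def coding_rate_reduction(Z, labels, eps=0.5):
    """Delta R = R(Z) - sum_j (n_j / n) * R(Z_j): the rate of the whole feature
    set minus the sample-weighted sum of the rates of the individual classes."""
    n = Z.shape[1]
    rate_all = coding_rate(Z, eps)
    rate_per_class = 0.0
    for c in np.unique(labels):
        Z_c = Z[:, labels == c]
        rate_per_class += (Z_c.shape[1] / n) * coding_rate(Z_c, eps)
    return rate_all - rate_per_class

# Toy check: two classes supported on orthogonal coordinate subspaces should
# yield a larger reduction than the same features with shuffled labels.
rng = np.random.default_rng(0)
mask0 = np.vstack([np.ones((8, 1)), np.zeros((8, 1))])   # class 0 lives in dims 0-7
mask1 = 1.0 - mask0                                      # class 1 lives in dims 8-15
Z = np.hstack([rng.normal(size=(16, 50)) * mask0, rng.normal(size=(16, 50)) * mask1])
Z = Z / np.linalg.norm(Z, axis=0, keepdims=True)         # project features to the sphere
labels = np.repeat([0, 1], 50)
print("Delta R (true labels):    ", coding_rate_reduction(Z, labels))
print("Delta R (shuffled labels):", coding_rate_reduction(Z, rng.permutation(labels)))
```

In this toy check the reduction is large when the labels match the two orthogonal subspaces and shrinks when the labels are shuffled, which is the behavior the principle rewards: representations that are diverse overall but compressible within each class.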
Related research

10/01/2022
Federated Representation Learning via Maximal Coding Rate Reduction
We propose a federated methodology to learn low-dimensional representati...

03/31/2022
Efficient Maximal Coding Rate Reduction by Variational Forms
The principle of Maximal Coding Rate Reduction (MCR^2) has recently been...

01/14/2021
Label Contrastive Coding based Graph Neural Network for Graph Classification
Graph classification is a critical research problem in many applications...

05/29/2021
Modeling Discriminative Representations for Out-of-Domain Detection with Supervised Contrastive Learning
Detecting Out-of-Domain (OOD) or unknown intents from user queries is es...

04/01/2022
WavFT: Acoustic model finetuning with labelled and unlabelled data
Unsupervised and self-supervised learning methods have leveraged unlabel...

01/29/2017
Supervised Deep Sparse Coding Networks
In this paper, we describe the deep sparse coding network (SCN), a novel...