Learning Robust Representations via Multi-View Information Bottleneck

02/17/2020
by Marco Federici, et al.

The information bottleneck principle provides an information-theoretic method for representation learning, by training an encoder to retain all information which is relevant for predicting the label while minimizing the amount of other, excess information in the representation. The original formulation, however, requires labeled data to identify the superfluous information. In this work, we extend this ability to the multi-view unsupervised setting, where two views of the same underlying entity are provided but the label is unknown. This enables us to identify superfluous information as that not shared by both views. A theoretical analysis leads to the definition of a new multi-view model that produces state-of-the-art results on the Sketchy dataset and label-limited versions of the MIR-Flickr dataset. We also extend our theory to the single-view setting by taking advantage of standard data augmentation techniques, empirically showing better generalization capabilities when compared to common unsupervised approaches for representation learning.
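
To make the idea concrete, below is a minimal sketch of a multi-view information-bottleneck objective in PyTorch. It assumes Gaussian encoders, an InfoNCE-style lower bound on the mutual information between the two representations, and a symmetrized KL term that penalizes view-specific (superfluous) information; the class and function names, the choice of mutual-information estimator, and all dimensions and hyperparameters are illustrative and not taken from the authors' implementation.

```python
# Sketch of a multi-view information-bottleneck loss (illustrative, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianEncoder(nn.Module):
    """Maps a view to the mean and log-variance of q(z | view)."""
    def __init__(self, in_dim, z_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, z_dim)
        self.logvar = nn.Linear(256, z_dim)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.logvar(h)

def reparameterize(mu, logvar):
    # Sample z ~ N(mu, sigma^2) with the reparameterization trick.
    return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

def symmetrized_kl(mu1, logvar1, mu2, logvar2):
    # 0.5 * [KL(q1 || q2) + KL(q2 || q1)] between diagonal Gaussians:
    # pushes the two posteriors together, discarding view-specific information.
    var1, var2 = logvar1.exp(), logvar2.exp()
    kl_12 = 0.5 * (logvar2 - logvar1 + (var1 + (mu1 - mu2) ** 2) / var2 - 1).sum(-1)
    kl_21 = 0.5 * (logvar1 - logvar2 + (var2 + (mu2 - mu1) ** 2) / var1 - 1).sum(-1)
    return 0.5 * (kl_12 + kl_21).mean()

def infonce_lower_bound(z1, z2, temperature=0.1):
    # InfoNCE estimate of I(z1; z2): matching pairs within the batch are positives.
    logits = F.normalize(z1, dim=-1) @ F.normalize(z2, dim=-1).t() / temperature
    labels = torch.arange(z1.size(0), device=z1.device)
    return -F.cross_entropy(logits, labels)

def multi_view_ib_loss(enc1, enc2, v1, v2, beta=1.0):
    mu1, logvar1 = enc1(v1)
    mu2, logvar2 = enc2(v2)
    z1, z2 = reparameterize(mu1, logvar1), reparameterize(mu2, logvar2)
    mi = infonce_lower_bound(z1, z2)                   # keep information shared by both views
    skl = symmetrized_kl(mu1, logvar1, mu2, logvar2)   # drop information specific to one view
    return -mi + beta * skl

# Toy usage: two 128-dimensional views of the same batch of entities.
enc1, enc2 = GaussianEncoder(128, 64), GaussianEncoder(128, 64)
v1, v2 = torch.randn(32, 128), torch.randn(32, 128)
loss = multi_view_ib_loss(enc1, enc2, v1, v2, beta=1.0)
loss.backward()
```

In the single-view extension described in the abstract, the second "view" would simply be a standard data augmentation of the same input, with the rest of the objective unchanged.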

Related research

Multi-view Fuzzy Representation Learning with Rules based Model (09/20/2023)
Unsupervised multi-view representation learning has been extensively stu...

Semi-supervised multi-view concept decomposition (07/03/2023)
Concept Factorization (CF), as a novel paradigm of representation learni...

Variational Distillation for Multi-View Learning (06/20/2022)
Information Bottleneck (IB) based multi-view learning provides an inform...

On the Multi-View Information Bottleneck Representation (02/06/2022)
In this work, we generalize the information bottleneck (IB) approach to ...

Inter-Battery Topic Representation Learning (05/19/2016)
In this paper, we present the Inter-Battery Topic Model (IBTM). Our appr...

Identifiability Results for Multimodal Contrastive Learning (03/16/2023)
Contrastive learning is a cornerstone underlying recent progress in mult...

Farewell to Mutual Information: Variational Distillation for Cross-Modal Person Re-Identification (04/07/2021)
The Information Bottleneck (IB) provides an information theoretic princi...