Farewell to Mutual Information: Variational Distillation for Cross-Modal Person Re-Identification

04/07/2021
by Xudong Tian, et al.

The Information Bottleneck (IB) provides an information-theoretic principle for representation learning: retain all information relevant to predicting the label while minimizing the redundancy. Though the IB principle has been applied to a wide range of applications, its optimization remains a challenging problem that relies heavily on an accurate estimation of mutual information. In this paper, we present a new strategy, Variational Self-Distillation (VSD), which provides a scalable, flexible, and analytical solution to essentially fitting the mutual information without explicitly estimating it. Under a rigorous theoretical guarantee, VSD enables the IB to grasp the intrinsic correlation between representation and label for supervised training. Furthermore, by extending VSD to multi-view learning, we introduce two further strategies, Variational Cross-Distillation (VCD) and Variational Mutual-Learning (VML), which significantly improve the robustness of representations to view changes by eliminating view-specific and task-irrelevant information. To verify our theoretically grounded strategies, we apply them to cross-modal person Re-ID and conduct extensive experiments, demonstrating superior performance against state-of-the-art methods. Our intriguing findings highlight the need to rethink the way mutual information is estimated.
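The core mechanism the abstract describes can be made concrete: rather than estimating mutual information directly, the predictive distribution of the compressed representation z is fitted to that of the full observation v through a KL divergence. Below is a minimal PyTorch sketch of such a self-distillation loss; the function name vsd_loss, the temperature parameter, and the stop-gradient on the observation branch are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def vsd_loss(logits_v: torch.Tensor, logits_z: torch.Tensor,
             temperature: float = 1.0) -> torch.Tensor:
    """Variational Self-Distillation sketch (hypothetical helper).

    Aligns the predictive distribution of the compressed representation z
    with that of the full observation v, so z preserves label-relevant
    information without an explicit mutual-information estimator.
    """
    # Observation branch acts as the teacher; stop gradients through it
    # (an assumption, mirroring common self-distillation practice).
    p_v = F.softmax(logits_v.detach() / temperature, dim=-1)
    # Bottleneck branch is the student; F.kl_div expects log-probabilities.
    log_p_z = F.log_softmax(logits_z / temperature, dim=-1)
    return F.kl_div(log_p_z, p_v, reduction="batchmean")

# Toy usage: 8 samples, 4 identity classes.
logits_v = torch.randn(8, 4)  # logits predicted from the full observation
logits_z = torch.randn(8, 4, requires_grad=True)  # logits from the IB representation
loss = vsd_loss(logits_v, logits_z)
loss.backward()
```

Under this reading, VCD and VML would apply the same distribution-matching idea across the two modalities rather than within a single branch.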


Related research

06/20/2022  Variational Distillation for Multi-View Learning
Information Bottleneck (IB) based multi-view learning provides an inform...

08/26/2022  CMD: Self-supervised 3D Action Representation Learning with Cross-modal Mutual Distillation
In 3D action recognition, there exists rich complementary information be...

06/13/2023  Enhanced Multimodal Representation Learning with Cross-modal KD
This paper explores the tasks of leveraging auxiliary modalities which a...

02/17/2020  Learning Robust Representations via Multi-View Information Bottleneck
The information bottleneck principle provides an information-theoretic m...

06/09/2020  Neural Methods for Point-wise Dependency Estimation
Since its inception, the neural estimation of mutual information (MI) ha...

10/25/2021  MUSE: Feature Self-Distillation with Mutual Information and Self-Information
We present a novel information-theoretic approach to introduce dependenc...

12/12/2019  General Information Bottleneck Objectives and their Applications to Machine Learning
We view the Information Bottleneck Principle (IBP: Tishby et al., 1999; ...
