To Compress or Not to Compress - Self-Supervised Learning and Information Theory: A Review

04/19/2023
by Ravid Shwartz-Ziv, et al.

Deep neural networks have demonstrated remarkable performance in supervised learning tasks but require large amounts of labeled data. Self-supervised learning offers an alternative paradigm, enabling the model to learn from data without explicit labels. Information theory has been instrumental in understanding and optimizing deep neural networks. Specifically, the information bottleneck principle has been applied to optimize the trade-off between compression and relevant information preservation in supervised settings. However, the optimal information objective in self-supervised learning remains unclear. In this paper, we review various approaches to self-supervised learning from an information-theoretic standpoint and present a unified framework that formalizes the self-supervised information-theoretic learning problem. We integrate existing research into a coherent framework, examine recent self-supervised methods, and identify research opportunities and challenges. Moreover, we discuss empirical measurement of information-theoretic quantities and their estimators. This paper offers a comprehensive review of the intersection between information theory, self-supervised learning, and deep neural networks.
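The trade-off described above is usually written as the information bottleneck Lagrangian, min I(X; Z) - beta I(Z; Y), and the abstract also notes that such information-theoretic quantities must be estimated empirically. As a hedged illustration (not the paper's own method), the widely used InfoNCE estimator gives a contrastive lower bound on mutual information from paired samples; the critic scores and names below are hypothetical:

```python
import numpy as np

def infonce_lower_bound(scores):
    """InfoNCE lower bound on mutual information I(X; Z).

    scores: (n, n) matrix where scores[i, j] is a critic value f(x_i, z_j);
    diagonal entries correspond to positive (paired) samples.
    The bound equals mean_i [ f(x_i, z_i) - logsumexp_j f(x_i, z_j) ] + log n,
    and can never exceed log n.
    """
    n = scores.shape[0]
    # Row-wise log-softmax, evaluated at the positive (diagonal) entries.
    log_probs = scores - np.log(np.sum(np.exp(scores), axis=1, keepdims=True))
    return np.mean(np.diag(log_probs)) + np.log(n)

# Toy check: when positive pairs score far above negatives,
# the estimate approaches its ceiling of log(n).
rng = np.random.default_rng(0)
n = 64
scores = rng.normal(size=(n, n)) + 10.0 * np.eye(n)
print(infonce_lower_bound(scores))  # at most log(64) ~= 4.16
```

The log n ceiling is the key limitation the estimator literature discusses: with a batch of n samples, InfoNCE can never report more than log n nats of mutual information, regardless of the true value.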


Related research:

- An Information-Theoretic Perspective on Variance-Invariance-Covariance Regularization (03/01/2023)
- Unpacking Information Bottlenecks: Unifying Information-Theoretic Objectives in Deep Learning (03/27/2020)
- On Information Plane Analyses of Neural Network Classifiers – A Review (03/21/2020)
- Supervised Learning of Labeled Pointcloud Differences via Cover-Tree Entropy Reduction (02/26/2017)
- What Do We Maximize in Self-Supervised Learning? (07/20/2022)
- Self-Supervised Learning of Graph Neural Networks: A Unified Review (02/22/2021)
- Self-supervised Rubik's Cube Solver (06/06/2021)
