The Information Bottleneck Problem and Its Applications in Machine Learning

04/30/2020
by Ziv Goldfeld, et al.

Inference capabilities of machine learning (ML) systems have skyrocketed in recent years, and they now play a pivotal role in various aspects of society. The goal in statistical learning is to use data to obtain simple algorithms for predicting a random variable Y from a correlated observation X. Since the dimension of X is typically huge, computationally feasible solutions should summarize it into a lower-dimensional feature vector T, from which Y is predicted. The algorithm will successfully make the prediction if T is a good proxy of Y, despite this dimensionality reduction. A myriad of ML algorithms (mostly employing deep learning (DL)) for finding such representations T based on real-world data are now available. While these methods are often effective in practice, a comprehensive theory explaining their success is still lacking. The information bottleneck (IB) theory recently emerged as a bold information-theoretic paradigm for analyzing DL systems. Adopting mutual information as the figure of merit, it suggests that the best representation T should be maximally informative about Y while minimizing its mutual information with X. In this tutorial we survey the information-theoretic origins of this abstract principle and its recent impact on DL. For the latter, we cover implications of the IB problem for DL theory, as well as practical algorithms inspired by it. Our goal is to provide a unified and cohesive description. A clear view of current knowledge is particularly important for further leveraging the IB and other information-theoretic ideas to study DL models.
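
For concreteness, the IB principle summarized above is usually posed as an optimization problem; the following is a standard statement from the IB literature, sketched here rather than quoted from the paper. Given the joint law of (X, Y), one seeks a stochastic mapping $p(t \mid x)$ minimizing the IB Lagrangian

    \min_{p(t \mid x)} \; I(X;T) - \beta\, I(T;Y), \qquad \text{subject to the Markov chain } Y \to X \to T,

where $\beta > 0$ trades off compression of X (a small $I(X;T)$) against preservation of the information relevant for predicting Y (a large $I(T;Y)$).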


