
The Distributed Information Bottleneck reveals the explanatory structure of complex systems

04/15/2022
by Kieran A. Murphy et al., University of Pennsylvania

The fruits of science are relationships made comprehensible, often by way of approximation. While deep learning is an extremely powerful way to find relationships in data, its use in science has been hindered by the difficulty of understanding the learned relationships. The Information Bottleneck (IB) is an information theoretic framework for understanding a relationship between an input and an output in terms of a trade-off between the fidelity and complexity of approximations to the relationship. Here we show that a crucial modification – distributing bottlenecks across multiple components of the input – opens fundamentally new avenues for interpretable deep learning in science. The Distributed Information Bottleneck throttles the downstream complexity of interactions between the components of the input, deconstructing a relationship into meaningful approximations found through deep learning without requiring custom-made datasets or neural network architectures. Applied to a complex system, the approximations illuminate aspects of the system's nature by restricting – and monitoring – the information about different components incorporated into the approximation. We demonstrate the Distributed IB's explanatory utility in systems drawn from applied mathematics and condensed matter physics. In the former, we deconstruct a Boolean circuit into approximations that isolate the most informative subsets of input components without requiring exhaustive search. In the latter, we localize information about future plastic rearrangement in the static structure of a sheared glass, and find the information to be more or less diffuse depending on the system's preparation. By way of a principled scheme of approximations, the Distributed IB brings much-needed interpretability to deep learning and enables unprecedented analysis of information flow through a system.
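To make the fidelity/complexity trade-off concrete, the following is a minimal sketch of a distributed-bottleneck objective, assuming a PyTorch-style variational implementation in the spirit of the variational IB: each input component gets its own stochastic Gaussian encoder, a KL penalty upper-bounds the information I(U_i; X_i) extracted from that component, and a shared decoder predicts the output from the concatenated latents. All names here (ComponentEncoder, DistributedIB, distributed_ib_loss) and architectural choices are illustrative, not taken from the paper or its repository.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ComponentEncoder(nn.Module):
    # Stochastic encoder for one input component X_i -> latent U_i.
    def __init__(self, in_dim, latent_dim, hidden=32):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)
        self.log_var = nn.Linear(hidden, latent_dim)

    def forward(self, x):
        h = self.body(x)
        mu, log_var = self.mu(h), self.log_var(h)
        # Reparameterized sample from the encoder's Gaussian posterior.
        u = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)
        # KL to a standard normal prior: a variational upper bound on I(X_i; U_i).
        kl = 0.5 * (mu.pow(2) + log_var.exp() - log_var - 1).sum(dim=-1)
        return u, kl

class DistributedIB(nn.Module):
    # One bottleneck per input component; a shared decoder predicts Y
    # from the concatenated latents.
    def __init__(self, component_dims, latent_dim, out_dim):
        super().__init__()
        self.encoders = nn.ModuleList(
            [ComponentEncoder(d, latent_dim) for d in component_dims])
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim * len(component_dims), 64), nn.ReLU(),
            nn.Linear(64, out_dim))

    def forward(self, components):
        # components: list of tensors, one per input component.
        pairs = [enc(x) for enc, x in zip(self.encoders, components)]
        us = torch.cat([u for u, _ in pairs], dim=-1)
        kls = torch.stack([kl for _, kl in pairs])  # (n_components, batch)
        return self.decoder(us), kls

def distributed_ib_loss(model, components, y, beta):
    # Fidelity term (here a classification loss) plus beta times the
    # summed per-component information costs.
    y_hat, kls = model(components)
    return F.cross_entropy(y_hat, y) + beta * kls.sum(dim=0).mean()

Sweeping beta and recording each component's KL term is one way to realize the "restricting and monitoring" of component-wise information described in the abstract: at a given level of total complexity, the components carrying the largest information cost are the ones most informative about the output.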



Code Repositories

distributed-ib

Code for "The Distributed Information Bottleneck reveals the explanatory structure of complex systems" (https://arxiv.org/abs/2204.07576)

