
Deep Sufficient Representation Learning via Mutual Information

07/21/2022
by Siming Zheng, et al.

We propose a mutual information-based sufficient representation learning (MSRL) approach, which uses the variational formulation of mutual information and leverages the approximation power of deep neural networks. MSRL learns a sufficient representation that attains maximal mutual information with the response while following a user-selected distribution. It readily handles multi-dimensional continuous or categorical response variables. MSRL is shown to be consistent in the sense that the conditional probability density function of the response given the learned representation converges to the conditional probability density function of the response given the predictor. Non-asymptotic error bounds for MSRL are also established under suitable conditions. To establish these bounds, we derive a generalized Dudley's inequality for an order-two U-process indexed by deep neural networks, which may be of independent interest. We also discuss how to determine the intrinsic dimension of the underlying data distribution. Finally, we evaluate MSRL via extensive numerical experiments and real data analysis and demonstrate that it outperforms some existing nonlinear sufficient dimension reduction methods.
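To make the variational idea concrete, the sketch below trains a neural representation R(X) by maximizing a Donsker-Varadhan lower bound on the mutual information I(R(X); Y). This is not the authors' MSRL implementation: the encoder and critic architectures, network sizes, and toy data are illustrative assumptions, and the user-selected distribution constraint on the representation is omitted.

```python
# Minimal sketch (assumed setup, not the paper's code): maximize a
# Donsker-Varadhan variational lower bound on I(R(X); Y) over an encoder R
# and a critic T, both parameterized by small neural networks.
import math
import torch
import torch.nn as nn

class Encoder(nn.Module):          # R: predictor X -> representation R(X)
    def __init__(self, x_dim, r_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, r_dim))
    def forward(self, x):
        return self.net(x)

class Critic(nn.Module):           # T: (R(X), Y) -> scalar score
    def __init__(self, r_dim, y_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(r_dim + y_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))
    def forward(self, r, y):
        return self.net(torch.cat([r, y], dim=-1)).squeeze(-1)

def dv_mi_lower_bound(critic, r, y):
    """Donsker-Varadhan bound: E[T(r, y)] - log E[exp(T(r, y_shuffled))]."""
    joint = critic(r, y).mean()
    y_perm = y[torch.randperm(y.size(0))]          # break the (r, y) pairing
    marginal = torch.logsumexp(critic(r, y_perm), dim=0) - math.log(y.size(0))
    return joint - marginal

# Toy usage: the response depends on X only through its first coordinate.
torch.manual_seed(0)
x = torch.randn(512, 5)
y = x[:, :1] ** 2 + 0.1 * torch.randn(512, 1)      # continuous response
encoder, critic = Encoder(5, 1), Critic(1, 1)
opt = torch.optim.Adam(list(encoder.parameters()) + list(critic.parameters()),
                       lr=1e-3)
for step in range(200):
    opt.zero_grad()
    loss = -dv_mi_lower_bound(critic, encoder(x), y)  # maximize the bound
    loss.backward()
    opt.step()
```

In MSRL the learned representation is additionally required to follow a user-selected distribution; the sketch above only illustrates the mutual-information maximization step shared by such variational approaches.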


Related research:

02/21/2018 - Learning to Explain: An Information-Theoretic Perspective on Model Interpretation
We introduce instancewise feature selection as a methodology for model i...

06/10/2020 - Deep Dimension Reduction for Supervised Representation Learning
The success of deep supervised learning depends on its automatic data re...

07/04/2022 - Representation Learning with Information Theory for COVID-19 Detection
Successful data representation is a fundamental factor in machine learni...

02/25/2020 - A Theory of Usable Information Under Computational Constraints
We propose a new framework for reasoning about information in complex sy...

10/19/2021 - A Deep Generative Approach to Conditional Sampling
We propose a deep generative approach to sampling from a conditional dis...

11/09/2017 - A Separation Principle for Control in the Age of Deep Learning
We review the problem of defining and inferring a "state" for a control ...

12/04/2022 - Statistical Physics of Deep Neural Networks: Initialization toward Optimal Channels
In deep learning, neural networks serve as noisy channels between input ...