Visualizing Information Bottleneck through Variational Inference

12/24/2022
by Cipta Herwana, et al.

The Information Bottleneck theory provides a theoretical and computational framework for finding approximate minimum sufficient statistics. Prior analysis of Stochastic Gradient Descent (SGD) training of a neural network on a toy problem revealed two distinct phases: fitting and compression. In this work, we analyze the SGD training of a Deep Neural Network on MNIST classification and confirm the existence of these two phases. We also propose a setup for estimating mutual information in a Deep Neural Network through Variational Inference.
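
To make the variational-inference angle concrete, below is a minimal sketch of a variational information bottleneck objective in PyTorch, in the style of Alemi et al.'s Deep VIB. It is an illustrative assumption of what such a setup could look like, not the authors' exact estimator: the encoder/decoder architecture, the Gaussian prior, and the `beta` weight are all hypothetical choices.

```python
# Hypothetical sketch of a variational information-bottleneck loss (not the paper's exact setup).
import torch
import torch.nn as nn
import torch.nn.functional as F

class VIBClassifier(nn.Module):
    def __init__(self, in_dim=784, latent_dim=32, num_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, latent_dim)
        self.logvar = nn.Linear(256, latent_dim)
        self.decoder = nn.Linear(latent_dim, num_classes)

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: sample z ~ N(mu, sigma^2) so gradients flow through the encoder.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z), mu, logvar

def vib_loss(logits, labels, mu, logvar, beta=1e-3):
    # Cross-entropy acts (up to a constant) as a variational lower bound on I(Z; Y).
    ce = F.cross_entropy(logits, labels)
    # KL(q(z|x) || N(0, I)) acts as a variational upper bound on I(X; Z).
    kl = -0.5 * torch.mean(torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1))
    return ce + beta * kl
```

Tracking the cross-entropy and KL terms over training gives proxy estimates of I(Z; Y) and I(X; Z), which is one way to visualize fitting (the prediction term falling) and compression (the KL term shrinking) during SGD.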

