Variational Predictive Information Bottleneck

10/23/2019
by Alexander A. Alemi, et al.

In classic papers, Zellner demonstrated that Bayesian inference can be derived as the solution to an information-theoretic functional. Below we derive a generalized form of this functional as a variational lower bound on a predictive information bottleneck objective. This generalized functional encompasses most modern inference procedures and suggests novel ones.
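As a rough sketch of the two objects the abstract connects (the notation below is assumed, not taken from the page itself): Zellner's result is that the Bayesian posterior uniquely minimizes an information-processing functional, and the predictive information bottleneck trades compression of observed data against prediction of future data.

```latex
% Zellner (1988): the Bayes posterior q(\theta \mid x) minimizes
F[q] \;=\; \mathrm{KL}\!\left(q(\theta \mid x)\,\|\,p(\theta)\right)
\;-\; \mathbb{E}_{q(\theta \mid x)}\!\left[\log p(x \mid \theta)\right],
% i.e. it maximizes the evidence lower bound.

% A predictive information bottleneck (sketch): compress past data X_P
% into a representation Z while retaining information about future data X_F,
\min_{p(z \mid x_P)} \; I(Z; X_P) \;-\; \beta\, I(Z; X_F),
% with \beta > 0 controlling the compression/prediction trade-off.
```

The paper's claim, as summarized in the abstract, is that a variational lower bound on an objective of the second kind recovers a generalized version of the first functional.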


Related research:

- 03/07/2020 · The Variational InfoMax Learning Objective
  Bayesian Inference and Information Bottleneck are the two most popular o...
- 11/06/2019 · Machine Learning using the Variational Predictive Information Bottleneck with a Validation Set
  Zellner (1988) modeled statistical inference in terms of information pro...
- 11/19/2008 · Deformed Statistics Formulation of the Information Bottleneck Method
  The theoretical basis for a candidate variational principle for the info...
- 05/24/2016 · Relevant sparse codes with variational information bottleneck
  In many applications, it is desirable to extract only the relevant aspec...
- 10/27/2018 · The Variational Deficiency Bottleneck
  We introduce a bottleneck method for learning data representations based...
- 01/27/2021 · Variational Encoders and Autoencoders: Information-theoretic Inference and Closed-form Solutions
  This work develops problem statements related to encoders and autoencode...
- 08/03/2018 · Cortical Microcircuits from a Generative Vision Model
  Understanding the information processing roles of cortical circuits is a...
