Likelihood Ratio Exponential Families

12/31/2020
by Rob Brekelmans, et al.

The exponential family is well known in machine learning and statistical physics as the maximum entropy distribution subject to a set of observed constraints, while the geometric mixture path is common in MCMC methods such as annealed importance sampling. Linking these two ideas, recent work has interpreted the geometric mixture path as an exponential family of distributions to analyze the thermodynamic variational objective (TVO). We extend these likelihood ratio exponential families to include solutions to rate-distortion (RD) optimization, the information bottleneck (IB) method, and recent rate-distortion-classification approaches which combine RD and IB. This provides a common mathematical framework for understanding these methods via the conjugate duality of exponential families and hypothesis testing. Further, we collect existing results to provide a variational representation of intermediate RD or TVO distributions as minimizing an expectation of KL divergences. This solution also corresponds to a size-power tradeoff using the likelihood ratio test and the Neyman-Pearson lemma. In thermodynamic integration bounds such as the TVO, we identify the intermediate distribution whose expected sufficient statistics match the log partition function.
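For readers unfamiliar with the construction the abstract refers to, here is a minimal sketch of how a geometric mixture path can be read as an exponential family; the symbols $\pi_0$, $\pi_1$, $T$, and $\psi$ below are illustrative notation chosen here, not necessarily the paper's:

$$ \pi_\beta(z) \;=\; \frac{1}{Z_\beta}\,\pi_0(z)^{1-\beta}\,\pi_1(z)^{\beta} \;=\; \pi_0(z)\,\exp\{\beta\,T(z) - \psi(\beta)\}, \qquad T(z) \;=\; \log\frac{\pi_1(z)}{\pi_0(z)}, $$

i.e. an exponential family with base measure $\pi_0$, sufficient statistic the log likelihood ratio $T(z)$, natural parameter $\beta$, and log partition function $\psi(\beta) = \log Z_\beta$. The standard identity $\psi'(\beta) = \mathbb{E}_{\pi_\beta}[T(z)]$ then gives the thermodynamic integration form underlying TVO-style bounds: assuming the usual TVO endpoints $\pi_0 = q(z|x)$ (normalized) and $\pi_1 = p(x,z)$,

$$ \log p(x) \;=\; \psi(1) - \psi(0) \;=\; \int_0^1 \mathbb{E}_{\pi_\beta}\!\left[\log\frac{p(x,z)}{q(z|x)}\right] d\beta, $$

so the expected sufficient statistics along the path integrate to the log partition function, as mentioned in the abstract.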


