The continuous categorical: a novel simplex-valued exponential family

02/20/2020
by Elliott Gordon-Rodriguez, et al.

Simplex-valued data appear throughout statistics and machine learning, for example in the context of transfer learning and compression of deep networks. Existing models for this class of data rely on the Dirichlet distribution or other related loss functions; here we show that these standard choices suffer systematically from a number of limitations, including bias and numerical issues that frustrate the use of flexible network models upstream of these distributions. We resolve these limitations by introducing a novel exponential family of distributions for modeling simplex-valued data: the continuous categorical, which arises as a nontrivial multivariate generalization of the recently discovered continuous Bernoulli. Unlike the Dirichlet and other typical choices, the continuous categorical results in a well-behaved probabilistic loss function that produces unbiased estimators, while preserving the mathematical simplicity of the Dirichlet. As well as exploring its theoretical properties, we introduce sampling methods for this distribution that are amenable to the reparameterization trick, and evaluate their performance. Lastly, we demonstrate that the continuous categorical outperforms standard choices empirically, across a simulation study, an applied example on multi-party elections, and a neural network compression task.
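The abstract describes the continuous categorical as a multivariate generalization of the continuous Bernoulli; its density on the simplex is proportional to the product of the parameters raised to the coordinates, p(x; λ) ∝ ∏ₖ λₖ^{xₖ}. Below is a minimal, illustrative sketch of this unnormalized log-density, with the normalizing constant estimated by Monte Carlo over a flat (uniform) Dirichlet on the simplex. The function names are our own, and the Monte Carlo normalization is a pedagogical stand-in for the closed-form constant derived in the paper.

```python
import math
import numpy as np

def cc_log_density_unnorm(x, lam):
    """Unnormalized log-density of the continuous categorical:
    log p(x; lam) = sum_k x_k * log(lam_k) + const, for x on the simplex."""
    return float(np.sum(x * np.log(lam)))

def cc_log_norm_const_mc(lam, n=200_000, seed=0):
    """Monte Carlo estimate of the log normalizing constant.

    Draws uniform samples on the simplex (a flat Dirichlet(1, ..., 1)),
    averages the unnormalized density, and multiplies by the simplex
    volume 1/(K-1)! in the (x_1, ..., x_{K-1}) parameterization.
    Illustrative only; the paper works with the exact constant.
    """
    rng = np.random.default_rng(seed)
    K = len(lam)
    xs = rng.dirichlet(np.ones(K), size=n)   # uniform on the simplex
    vals = np.exp(xs @ np.log(lam))          # prod_k lam_k ** x_k
    vol = 1.0 / math.factorial(K - 1)        # volume of the (K-1)-simplex
    return float(np.log(vals.mean() * vol))
```

As a sanity check, for K = 2 the continuous categorical reduces to a continuous Bernoulli, whose normalizing integral ∫₀¹ λ₁ˣ λ₂^{1-x} dx = (λ₁ - λ₂) / log(λ₁ / λ₂) can be compared against the Monte Carlo estimate.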


