Gaussian-Dirichlet Posterior Dominance in Sequential Learning

02/14/2017
by Ian Osband, et al.

We consider the problem of sequential learning from categorical observations bounded in [0,1]. We establish an ordering between the Dirichlet posterior over categorical outcomes and a Gaussian posterior under observations with N(0,1) noise. We show that, conditioned on identical data with at least two observations, the posterior mean of the categorical distribution always second-order stochastically dominates the posterior mean of the Gaussian distribution. These results provide a useful tool for the analysis of sequential learning under categorical outcomes.
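As an informal illustration of the dominance statement (a Monte Carlo sketch, not the paper's construction: the priors, the data, and the utility function below are assumptions chosen for the example), one can sample the posterior mean under both models after observing the same data and spot-check the second-order stochastic dominance criterion E[u(X)] >= E[u(Y)] for a concave increasing u:

```python
import numpy as np

rng = np.random.default_rng(0)

# Observations in [0, 1]; here binary outcomes for simplicity (assumed data).
data = np.array([0.0, 1.0, 1.0])
n, xbar = len(data), data.mean()

# Categorical model over outcomes {0, 1} with a uniform Dirichlet(1, 1) prior.
# A posterior sample p ~ Dirichlet(1 + #zeros, 1 + #ones) has mean value p[1],
# since the outcome values are 0 and 1.
alpha = np.array([1 + (data == 0).sum(), 1 + (data == 1).sum()])
dir_mean = rng.dirichlet(alpha, size=100_000)[:, 1]

# Gaussian model: N(0, 1) prior on the mean, N(0, 1) observation noise.
# The posterior over the mean is N(n * xbar / (n + 1), 1 / (n + 1)).
gauss_mean = rng.normal(n * xbar / (n + 1), np.sqrt(1 / (n + 1)), size=100_000)

# Second-order stochastic dominance of X over Y implies E[u(X)] >= E[u(Y)]
# for every concave increasing u; spot-check with one such utility.
def u(x):
    return -np.exp(-x)

print(u(dir_mean).mean() >= u(gauss_mean).mean())
```

A single concave utility is of course only a necessary check, not a proof: the dominance claim quantifies over all concave increasing functions, which is what the paper's analysis addresses.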


