Augment-and-Conquer Negative Binomial Processes

09/05/2012
by Mingyuan Zhou et al.

By developing data augmentation methods unique to the negative binomial (NB) distribution, we unite seemingly disjoint count and mixture models under the NB process framework. We develop fundamental properties of the models and derive efficient Gibbs sampling inference. We show that the gamma-NB process can be reduced to the hierarchical Dirichlet process with normalization, highlighting its unique theoretical, structural and computational advantages. A variety of NB processes with distinct sharing mechanisms are constructed and applied to topic modeling, with connections to existing algorithms, showing the importance of inferring both the NB dispersion and probability parameters.
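As background for the augmentation schemes the abstract describes, the NB distribution arises as a gamma-mixed Poisson: drawing a rate from a gamma distribution and then a Poisson count recovers NB(r, p). The sketch below checks this empirically; the parameter values are illustrative, and note that NumPy's `negative_binomial` uses the complementary probability convention, so `1 - p` is passed.

```python
import numpy as np

rng = np.random.default_rng(0)
r, p = 5.0, 0.3  # NB dispersion and probability parameters (illustrative)
n = 200_000

# Gamma-Poisson mixture: lambda ~ Gamma(r, scale = p/(1-p)), m | lambda ~ Poisson(lambda)
lam = rng.gamma(shape=r, scale=p / (1 - p), size=n)
m_mix = rng.poisson(lam)

# Direct NB draws for comparison; NumPy parameterizes by the "success"
# probability, so 1 - p here matches mean = r * p / (1 - p) above.
m_nb = rng.negative_binomial(r, 1 - p, size=n)

# Both samples should share mean r*p/(1-p) and variance r*p/(1-p)^2
print(m_mix.mean(), m_nb.mean())
print(m_mix.var(), m_nb.var())
```

Under this construction, augmenting the latent gamma rates (and, in the paper, the table counts for the dispersion r) is what yields closed-form Gibbs updates.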


Related research

- Negative Binomial Process Count and Mixture Modeling (09/15/2012)
- On Data Augmentation for Models Involving Reciprocal Gamma Functions (03/03/2022)
- Bayesian Inference for Gamma Models (06/03/2021)
- A Gamma-Poisson Mixture Topic Model for Short Text (04/23/2020)
- Models for Genetic Diversity Generated by Negative Binomial Point Processes (04/30/2019)
- Improved Bayesian Logistic Supervised Topic Models with Data Augmentation (10/09/2013)
- Differential Expression Analysis of Dynamical Sequencing Count Data with a Gamma Markov Chain (03/07/2018)
