A simple non-parametric Topic Mixture for Authors and Documents

11/27/2012
by Arnim Bleier et al.

This article reviews the Author-Topic Model and presents a new non-parametric extension based on the Hierarchical Dirichlet Process. The extension is especially suitable when no prior information is available about the number of components required. A blocked Gibbs sampler is described, with a focus on staying as close as possible to the original model while incurring only the minimum theoretical and implementation overhead necessary.
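As background for the Gibbs sampler the abstract mentions, the original (parametric) Author-Topic Model is commonly fit with a collapsed Gibbs sampler that resamples an (author, topic) pair jointly for each token. The sketch below illustrates that baseline scheme only; it is not the paper's implementation. The function name, the fixed topic count `K`, the hyperparameters, and the toy corpus are all assumptions, and the non-parametric HDP extension described in the article would replace the fixed `K` with a growing number of components.

```python
import numpy as np

def sample_atm(docs, authors, V, A, K, alpha=0.1, beta=0.01, iters=50, seed=0):
    """Collapsed Gibbs sampler sketch for the parametric Author-Topic Model.

    docs:    list of documents, each a list of word ids in [0, V)
    authors: list of author-id lists, one per document
    Returns author-topic and topic-word count matrices.
    (Illustrative sketch; hyperparameters and structure are assumptions.)
    """
    rng = np.random.default_rng(seed)
    n_ak = np.zeros((A, K))   # author-topic counts
    n_kw = np.zeros((K, V))   # topic-word counts
    n_k = np.zeros(K)         # tokens per topic
    n_a = np.zeros(A)         # tokens per author
    assign = []               # (author, topic) assignment per token

    # Random initialization: each token gets a uniform author and topic.
    for d, doc in enumerate(docs):
        za = []
        for w in doc:
            a = rng.choice(authors[d])
            k = rng.integers(K)
            n_ak[a, k] += 1; n_kw[k, w] += 1; n_k[k] += 1; n_a[a] += 1
            za.append((a, k))
        assign.append(za)

    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                a, k = assign[d][i]
                # Remove the token's current assignment from the counts.
                n_ak[a, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1; n_a[a] -= 1
                # Joint conditional over (author, topic) pairs for this token:
                # p(x=a, z=k | rest) proportional to
                #   (n_ak + alpha)/(n_a + K*alpha) * (n_kw + beta)/(n_k + V*beta)
                p = ((n_ak[authors[d]] + alpha)
                     / (n_a[authors[d], None] + K * alpha)
                     * (n_kw[:, w] + beta) / (n_k + V * beta))
                p = p.ravel() / p.sum()
                idx = rng.choice(len(p), p=p)
                a, k = authors[d][idx // K], idx % K
                n_ak[a, k] += 1; n_kw[k, w] += 1; n_k[k] += 1; n_a[a] += 1
                assign[d][i] = (a, k)
    return n_ak, n_kw
```

Sampling the author and topic as a single block for each token (rather than one conditional on the other) is the standard move in Author-Topic samplers; the paper's blocked sampler operates in the same spirit, but over the HDP-based extension rather than this fixed-K baseline.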


Related research

08/26/2015  Nested Hierarchical Dirichlet Processes for Multi-Level Non-Parametric Admixture Modeling
Dirichlet Process (DP) is a Bayesian non-parametric prior for infinite mi...

09/22/2016  Bibliographic Analysis with the Citation Network Topic Model
Bibliographic analysis considers author's research areas, the citation n...

08/22/2012  A non-parametric mixture model for topic modeling over time
A single, stationary topic model such as latent Dirichlet allocation is ...

01/20/2016  Hierarchical Latent Word Clustering
This paper presents a new Bayesian non-parametric model by extending the...

06/27/2012  A Non-Parametric Bayesian Method for Inferring Hidden Causes
We present a non-parametric Bayesian approach to structure learning with...

06/27/2016  Dynamic Hierarchical Dirichlet Process for Abnormal Behaviour Detection in Video
This paper proposes a novel dynamic Hierarchical Dirichlet Process topic...

11/30/2020  A Framework for Authorial Clustering of Shorter Texts in Latent Semantic Spaces
Authorial clustering involves the grouping of documents written by the s...
