Neural Sinkhorn Topic Model

08/12/2020
by He Zhao, et al.

In this paper, we present a new topic modelling approach via the theory of optimal transport (OT). Specifically, we represent a document as two distributions: a distribution over the words (doc-word distribution) and a distribution over the topics (doc-topic distribution). For one document, the doc-word distribution is the observed, sparse, low-level representation of the content, while the doc-topic distribution is the latent, dense, high-level representation of the same content. Learning a topic model can then be viewed as a process of minimising the cost of transporting the semantic information from one distribution to the other. This new viewpoint leads to a novel OT-based topic modelling framework, which enjoys appealing simplicity, effectiveness, and efficiency. Extensive experiments show that our framework significantly outperforms several state-of-the-art models in terms of both topic quality and document representations.
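The abstract casts topic modelling as minimising an entropically regularised OT cost between the doc-word and doc-topic distributions, which is what the Sinkhorn algorithm computes. As an illustration only (not the authors' implementation; the function name, toy sizes, and cost matrix are hypothetical), here is a minimal Sinkhorn sketch for the transport cost between two discrete distributions:

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=200):
    """Entropic-regularised OT between discrete distributions a and b.

    a: (n,) source weights, e.g. a doc-word distribution, summing to 1
    b: (m,) target weights, e.g. a doc-topic distribution, summing to 1
    C: (n, m) cost matrix, e.g. word-to-topic semantic distances
    Returns the transport plan P and the transport cost <P, C>.
    """
    K = np.exp(-C / eps)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)         # scale columns so P's column sums match b
        u = a / (K @ v)           # scale rows so P's row sums match a
    P = u[:, None] * K * v[None, :]
    return P, float((P * C).sum())

# Toy example: 4 "words", 2 "topics", random cost matrix.
rng = np.random.default_rng(0)
a = np.array([0.4, 0.3, 0.2, 0.1])
b = np.array([0.6, 0.4])
C = rng.random((4, 2))
P, cost = sinkhorn(a, b, C)
print(P.sum(axis=1))  # ~ a: the plan's row marginals match the source
print(P.sum(axis=0))  # ~ b: the column marginals match the target
```

In the paper's framing, gradients of this kind of cost with respect to the model's parameters would drive learning; here the cost matrix is simply random to keep the sketch self-contained.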

