
Entity-Aware Language Model as an Unsupervised Reranker

by Mohammad Sadegh Rasooli, et al.
Columbia University

In language modeling, it is difficult to incorporate entity relationships from a knowledge base. One solution is to train a reranker with global features derived from n-best lists. However, training such a reranker requires manually annotated n-best lists, which are expensive to obtain. We propose a method based on contrastive estimation (Smith and Eisner, 2005) that alleviates the need for such data. Experiments in the music domain demonstrate that both global features and features extracted from an external knowledge base can be incorporated into our reranker. Our final model achieves a 0.44 absolute word error rate improvement on the blind test data.
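The abstract describes reranking an n-best list with a globally featured model trained by contrastive estimation, which raises the score of an observed string relative to a neighborhood of alternatives rather than requiring annotated lists. Below is a minimal sketch of that idea; the feature functions and the choice of the n-best list itself as the contrastive neighborhood are illustrative assumptions, not the paper's actual feature set or neighborhood design:

```python
import math

def features(hyp):
    # Hypothetical global features over a whole hypothesis string.
    # The paper's real features (e.g. knowledge-base entity relations)
    # are not reproduced here.
    toks = hyp.split()
    return {
        "length": float(len(toks)),
        "repeats": float(len(toks) - len(set(toks))),
    }

def score(weights, hyp):
    # Linear model: dot product of weights and global features.
    return sum(weights.get(f, 0.0) * v for f, v in features(hyp).items())

def rerank(weights, nbest):
    # Return the highest-scoring hypothesis from the n-best list.
    return max(nbest, key=lambda h: score(weights, h))

def contrastive_update(weights, observed, neighborhood, lr=0.1):
    # One gradient step on a contrastive-estimation objective:
    # raise log p(observed) relative to the neighborhood, where
    # p is a softmax over the neighborhood's linear scores.
    hyps = [observed] + [h for h in neighborhood if h != observed]
    scores = [score(weights, h) for h in hyps]
    m = max(scores)                      # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    Z = sum(exps)
    probs = [e / Z for e in exps]
    # Gradient = features(observed) - expected features under the model.
    grad = dict(features(observed))
    for h, p in zip(hyps, probs):
        for f, v in features(h).items():
            grad[f] = grad.get(f, 0.0) - p * v
    for f, g in grad.items():
        weights[f] = weights.get(f, 0.0) + lr * g
    return weights
```

A few updates treating the top ASR hypothesis as "observed" push the weights to prefer it over its n-best competitors, after which `rerank` selects with the learned model; no manually annotated lists are needed, which is the point of the unsupervised setup.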

