
Entity-Aware Language Model as an Unsupervised Reranker

03/12/2018
by Mohammad Sadegh Rasooli, et al.
Microsoft
Columbia University

In language modeling, it is difficult to incorporate entity relationships from a knowledge base. One solution is to use a reranker trained with global features, where the global features are derived from n-best lists. However, training such a reranker requires manually annotated n-best lists, which are expensive to obtain. We propose a method based on contrastive estimation (Smith and Eisner, 2005) that alleviates the need for such data. Experiments in the music domain demonstrate that global features, as well as features extracted from an external knowledge base, can be incorporated into our reranker. Our final model achieves a 0.44 absolute word error rate improvement on the blind test data.
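The reranking setup described above can be sketched as follows: each hypothesis in an n-best list carries a feature vector (its base recognizer score plus knowledge-base-derived features), and the reranker picks the hypothesis with the highest weighted score. This is a minimal illustrative sketch, not the paper's actual model; the feature names and weights are hypothetical.

```python
# Minimal n-best reranking sketch. Feature names ("asr_score",
# "kb_entity_match") and weights are hypothetical placeholders,
# not the features or training method used in the paper.

def rerank(nbest, weights):
    """Return the hypothesis with the highest weighted feature score."""
    def score(hyp):
        return sum(weights.get(name, 0.0) * value
                   for name, value in hyp["features"].items())
    return max(nbest, key=score)

# Each hypothesis pairs its recognizer score with an entity-aware
# global feature (here: whether the entity matches a knowledge base).
nbest = [
    {"text": "play songs by the beetles",
     "features": {"asr_score": -4.2, "kb_entity_match": 0.0}},
    {"text": "play songs by the beatles",
     "features": {"asr_score": -4.5, "kb_entity_match": 1.0}},
]
weights = {"asr_score": 1.0, "kb_entity_match": 0.8}

best = rerank(nbest, weights)
print(best["text"])  # the knowledge-base match outweighs the small ASR gap
```

In a log-linear reranker, the weights would be learned; contrastive estimation lets them be trained against automatically generated neighborhoods instead of manually annotated n-best lists.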

