
Evaluating Pretrained Transformer Models for Entity Linking in Task-Oriented Dialog

by   Sai Muralidhar Jayanthi, et al.

The wide applicability of pretrained transformer models (PTMs) to natural language tasks is well demonstrated, but their ability to comprehend short phrases of text is less explored. To this end, we evaluate different PTMs through the lens of unsupervised entity linking in task-oriented dialog across five characteristics: syntactic, semantic, short-forms, numeric and phonetic. Our results demonstrate that several of the PTMs produce sub-par results compared to traditional techniques, albeit competitive with other neural baselines. We find that some of their shortcomings can be addressed by using PTMs fine-tuned for text-similarity tasks, which show an improved ability to comprehend semantic and syntactic correspondences, as well as some improvement on short-forms, numeric and phonetic variations in entity mentions. We perform qualitative analysis to understand nuances in their predictions and discuss scope for further improvements. Code can be found at
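The unsupervised setup the abstract describes reduces to: embed the entity mention, embed each candidate entity name from the catalog, and link to the nearest candidate by similarity. The sketch below illustrates that pipeline; as a stand-in for a PTM sentence encoder it uses simple character n-gram vectors (a traditional baseline of the kind the paper compares against), and all names (`link_entity`, the sample catalog) are hypothetical, not from the paper's released code.

```python
from collections import Counter
import math

def char_ngrams(text, n=3):
    """Bag of character trigrams, a cheap stand-in for a sentence embedding."""
    text = f" {text.lower()} "
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(v * b[k] for k, v in a.items() if k in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def link_entity(mention, candidates, embed=char_ngrams):
    # Rank candidates by similarity to the mention and return the best one.
    # In the paper's setting, `embed` would instead call a PTM encoder
    # (e.g. one fine-tuned for text similarity).
    mention_vec = embed(mention)
    return max(candidates, key=lambda c: cosine(mention_vec, embed(c)))

catalog = ["Golden Gate Bridge Cafe", "Gateway Diner", "Golden Dragon Restaurant"]
print(link_entity("golden gate cafe", catalog))  # -> Golden Gate Bridge Cafe
```

Swapping `char_ngrams` for a PTM-based encoder changes only the `embed` argument, which is what makes this setup convenient for comparing models across syntactic, semantic, short-form, numeric and phonetic mention variations.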



