What did you Mention? A Large Scale Mention Detection Benchmark for Spoken and Written Text

01/23/2018
by Yosi Mass, et al.

We describe a large, high-quality benchmark for the evaluation of Mention Detection tools. The benchmark contains annotations of both named entities and other types of entities, covering different types of text ranging from clean text taken from Wikipedia to noisy spoken data. The benchmark was built through a highly controlled crowdsourcing process to ensure its quality. We describe the benchmark, the process, and the guidelines used to build it, and then report the results of a state-of-the-art system run on the benchmark.
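
As a minimal sketch of how such a benchmark might be consumed, the snippet below scores a hypothetical system's predicted mention spans against gold annotations using exact span matching. The span format, example data, and matching criterion are illustrative assumptions, not the paper's actual evaluation protocol.

```python
# Minimal sketch: mention-level precision/recall/F1 with exact span matching.
# The span format, example data, and matching criterion are assumptions for
# illustration, not the benchmark's actual format or scoring protocol.

def score_mentions(gold, predicted):
    """Compare two collections of (doc_id, start, end) mention spans."""
    gold_set, pred_set = set(gold), set(predicted)
    tp = len(gold_set & pred_set)  # exact-match true positives
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical gold and system mentions as (doc_id, start_token, end_token).
gold = [("doc1", 0, 2), ("doc1", 5, 6), ("doc2", 3, 5)]
pred = [("doc1", 0, 2), ("doc1", 4, 6), ("doc2", 3, 5)]

p, r, f1 = score_mentions(gold, pred)
print(f"P={p:.2f} R={r:.2f} F1={f1:.2f}")
```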

