Fast End-to-End Wikification

08/19/2019
by Ilya Shnayderman, et al.

Wikification of large corpora is beneficial for various NLP applications. Existing methods focus on quality rather than run-time performance, and are therefore infeasible for large data. Here, we introduce RedW, a run-time oriented Wikification solution based on Wikipedia redirects, which can Wikify massive corpora with competitive performance. We further propose an efficient method for estimating RedW confidence, opening the door to applying more demanding methods only on top of RedW's lower-confidence results. Our experimental results support the validity of the proposed approach.
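The abstract does not detail RedW's algorithm, but the core idea of redirect-based Wikification can be illustrated with a minimal sketch: map surface mentions to canonical Wikipedia pages via a lookup table built from redirect and page titles. The table and helper below are hypothetical illustrations, not the authors' implementation.

```python
# Minimal sketch of redirect-based entity linking, assuming a precomputed
# mapping from lowercased Wikipedia redirect/page titles to canonical pages.
# Tiny hypothetical sample table; a real table would be built from a
# Wikipedia dump's redirect entries.
REDIRECTS = {
    "nyc": "New York City",
    "new york city": "New York City",
    "big apple": "New York City",
    "ibm": "IBM",
    "international business machines": "IBM",
}

def wikify(text: str, max_ngram: int = 3):
    """Greedy longest-match lookup of token n-grams against the redirect table."""
    tokens = text.split()
    links = []
    i = 0
    while i < len(tokens):
        matched = False
        # Try the longest n-gram first so "new york city" wins over "new".
        for n in range(min(max_ngram, len(tokens) - i), 0, -1):
            mention = " ".join(tokens[i:i + n]).lower().strip(".,!?")
            if mention in REDIRECTS:
                links.append((mention, REDIRECTS[mention]))
                i += n
                matched = True
                break
        if not matched:
            i += 1
    return links

print(wikify("IBM opened a new office in the Big Apple."))
# [('ibm', 'IBM'), ('big apple', 'New York City')]
```

Because the whole procedure reduces to dictionary lookups, it scales linearly with corpus size, which is consistent with the paper's run-time-oriented framing; slower, more demanding linkers could then be applied only to the low-confidence cases.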


