Empirical Error Modeling Improves Robustness of Noisy Neural Sequence Labeling

05/25/2021
by   Marcin Namysl, et al.

Despite recent advances, standard sequence labeling systems often fail when processing noisy user-generated text or consuming the output of an Optical Character Recognition (OCR) process. In this paper, we improve the noise-aware training method by proposing an empirical error generation approach that employs a sequence-to-sequence model trained to translate error-free text into erroneous text. Using an OCR engine, we generated a large parallel text corpus for training and produced several real-world noisy sequence labeling benchmarks for evaluation. Moreover, to overcome the data sparsity problem, which is exacerbated by imperfect textual input, we learned noisy language model-based embeddings. Our approach outperformed the baseline noise generation and error correction techniques on the erroneous sequence labeling data sets. To facilitate future research on robustness, we make our code, embeddings, and data conversion scripts publicly available.
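As a rough illustration only (not the authors' implementation), the sketch below shows the data-augmentation flavor of a noise-aware training objective with a pluggable error generator: each training sentence is paired with an automatically corrupted copy, and the tagger is optimized on both. The toy character-confusion sampler stands in for the paper's learned sequence-to-sequence error model, and all names (CONFUSIONS, generate_errors, TinyTagger) are hypothetical.

```python
# Minimal, illustrative sketch of noise-aware training with a pluggable
# error generator. The character-confusion sampler below is a stand-in for
# the learned seq2seq error model described in the paper; names are hypothetical.
import random
import torch
import torch.nn as nn

# Toy OCR-like character confusions (stand-in for an empirical error model).
CONFUSIONS = {"l": ["1", "i"], "o": ["0"], "e": ["c"]}

def generate_errors(tokens, p=0.2):
    """Return a noisy copy of the token sequence (error-generation step)."""
    noisy = []
    for tok in tokens:
        chars = list(tok)
        for i, ch in enumerate(chars):
            if ch in CONFUSIONS and random.random() < p:
                chars[i] = random.choice(CONFUSIONS[ch])
        noisy.append("".join(chars))
    return noisy

class TinyTagger(nn.Module):
    """Toy BiLSTM tagger; a real system would use contextual embeddings."""
    def __init__(self, vocab, n_labels, dim=32):
        super().__init__()
        self.vocab = vocab
        self.emb = nn.Embedding(len(vocab), dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * dim, n_labels)

    def forward(self, tokens):
        ids = torch.tensor([[self.vocab.get(t, 0) for t in tokens]])
        h, _ = self.lstm(self.emb(ids))
        return self.out(h).squeeze(0)  # (seq_len, n_labels)

# Toy example: one sentence with NER-style labels (0=O, 1=PER, 2=LOC).
tokens, labels = ["john", "lives", "in", "london"], torch.tensor([1, 0, 0, 2])
vocab = {t: i + 1 for i, t in enumerate(set(tokens))}
vocab["<unk>"] = 0
model = TinyTagger(vocab, n_labels=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
ce = nn.CrossEntropyLoss()

for step in range(50):
    noisy_tokens = generate_errors(tokens)  # corrupt the clean sentence
    # Noise-aware objective: supervise the tagger on both clean and noisy input.
    loss = ce(model(tokens), labels) + ce(model(noisy_tokens), labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In the approach described in the abstract, the fixed confusion table would be replaced by a sequence-to-sequence model trained on an OCR-aligned parallel corpus, so that the sampled errors follow the empirical error distribution rather than a hand-crafted one.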


