Incremental Processing in the Age of Non-Incremental Encoders: An Empirical Assessment of Bidirectional Models for Incremental NLU

10/11/2020
by Brielen Madureira, et al.

While humans process language incrementally, the best language encoders currently used in NLP do not. Both bidirectional LSTMs and Transformers assume that the sequence to be encoded is available in full, to be processed either forwards and backwards (BiLSTMs) or as a whole (Transformers). We investigate how they behave under incremental interfaces, where partial output must be provided based on the partial input seen up to a certain time step, as may happen in interactive systems. We test five models on various NLU datasets and compare their performance using three incremental evaluation metrics. The results support the possibility of using bidirectional encoders in incremental mode while retaining most of their non-incremental quality. The "omni-directional" BERT model, which achieves better non-incremental performance, is more strongly affected by incremental access. This can be alleviated by adapting the training regime (truncated training) or the testing procedure, either by delaying the output until some right context is available or by incorporating hypothetical right contexts generated by a language model like GPT-2.
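As a concrete illustration of the incremental interface described in the abstract, the sketch below feeds a bidirectional encoder growing prefixes of the input ("restart-incremental" processing) and commits a label for a token only after a fixed amount of right context has been seen, i.e., delayed output. This is a minimal sketch under stated assumptions: the toy BiLSTM tagger, vocabulary size, and delay value are illustrative choices, not the paper's exact models or hyperparameters.

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Toy bidirectional tagger standing in for the BiLSTM/Transformer encoders."""
    def __init__(self, vocab_size=100, emb_dim=32, hidden=64, num_labels=5):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        self.out = nn.Linear(2 * hidden, num_labels)

    def forward(self, token_ids):                     # (1, T) -> (1, T, num_labels)
        hidden_states, _ = self.rnn(self.emb(token_ids))
        return self.out(hidden_states)

def restart_incremental(model, token_ids, delay=1):
    """Re-encode every prefix from scratch; commit the label for a position only
    once `delay` tokens of right context have been observed (delayed output)."""
    committed = []
    for t in range(1, len(token_ids) + 1):
        prefix = torch.tensor([token_ids[:t]])        # partial input x_1..x_t
        with torch.no_grad():
            logits = model(prefix)                    # partial (non-final) output
        while len(committed) < t - delay:             # enough right context seen
            committed.append(logits[0, len(committed)].argmax().item())
    with torch.no_grad():                             # flush the last `delay` positions
        logits = model(torch.tensor([token_ids]))
    committed += [logits[0, i].argmax().item()
                  for i in range(len(committed), len(token_ids))]
    return committed

model = BiLSTMTagger()
print(restart_incremental(model, token_ids=[3, 14, 15, 9, 2, 6], delay=1))
```

In the same spirit, the toy tagger could be swapped for a pretrained BERT token classifier, and the "hypothetical right context" variant would append a continuation sampled from a language model such as GPT-2 to each prefix before re-encoding; the function above only demonstrates the prefix re-encoding and output delay.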


Related research

research ∙ 09/15/2021
Towards Incremental Transformers: An Empirical Analysis of Transformer Models for Incremental NLU
Incremental processing allows interactive systems to respond based on pa...

research ∙ 05/18/2023
TAPIR: Learning Adaptive Revision for Incremental Natural Language Understanding with a Two-Pass Model
Language is by its very nature incremental in how it is produced and pro...

research ∙ 10/08/2020
Masked ELMo: An evolution of ELMo towards fully contextual RNN language models
This paper presents Masked ELMo, a new RNN-based model for language mode...

research ∙ 09/30/2018
An Incremental Iterated Response Model of Pragmatics
Recent Iterated Response (IR) models of pragmatics conceptualize languag...

research ∙ 05/28/2014
Conformant Planning as a Case Study of Incremental QBF Solving
We consider planning with uncertainty in the initial state as a case stu...

research ∙ 10/13/2017
A Deep Incremental Boltzmann Machine for Modeling Context in Robots
Context is an essential capability for robots that are to be as adaptive...
