Improving sentence compression by learning to predict gaze

04/12/2016
by Sigrid Klerke, et al.

We show how eye-tracking corpora can be used to improve sentence compression models, presenting a novel multi-task learning algorithm based on multi-layer LSTMs. We obtain performance competitive with or better than state-of-the-art approaches.
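The abstract describes a multi-task setup in which gaze prediction serves as an auxiliary task alongside sentence compression. A minimal sketch of that idea, assuming a shared bi-LSTM encoder with one token-level head for keep/delete decisions and one for a gaze measure (all names, dimensions, and the joint-loss weighting here are illustrative assumptions, not the paper's actual implementation):

```python
import torch
import torch.nn as nn

class MultiTaskLSTM(nn.Module):
    """Shared multi-layer bi-LSTM encoder with two token-level heads:
    compression (keep vs. delete) and gaze prediction (e.g. fixation duration)."""
    def __init__(self, vocab_size, emb_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden, num_layers=2,
                               batch_first=True, bidirectional=True)
        self.compress_head = nn.Linear(2 * hidden, 2)   # keep / delete logits
        self.gaze_head = nn.Linear(2 * hidden, 1)       # scalar gaze target per token

    def forward(self, tokens):
        h, _ = self.encoder(self.embed(tokens))
        return self.compress_head(h), self.gaze_head(h).squeeze(-1)

# One joint training step on toy data: the auxiliary gaze loss
# regularises the shared encoder that the compression head also uses.
model = MultiTaskLSTM(vocab_size=1000)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
tokens = torch.randint(0, 1000, (4, 12))      # batch of 4 sentences, 12 tokens
keep_labels = torch.randint(0, 2, (4, 12))    # toy compression labels
gaze = torch.rand(4, 12)                      # toy gaze targets

logits, gaze_pred = model(tokens)
loss = (nn.functional.cross_entropy(logits.reshape(-1, 2), keep_labels.reshape(-1))
        + nn.functional.mse_loss(gaze_pred, gaze))
opt.zero_grad()
loss.backward()
opt.step()
```

In practice the two tasks need not be trained on the same sentences; alternating batches from the compression corpus and the eye-tracking corpus lets the gaze data act purely as an auxiliary signal for the shared layers.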


Related research

02/01/2019 · Human acceptability judgements for extractive sentence compression
Recent approaches to English-language sentence compression rely on paral...

10/15/2020 · Improving Natural Language Processing Tasks with Human Gaze-Guided Neural Attention
A lack of corpora has so far limited advances in integrating human gaze ...

05/25/2020 · Happy Are Those Who Grade without Seeing: A Multi-Task Learning Approach to Grade Essays Using Gaze Behaviour
The gaze behaviour of a reader is helpful in solving several NLP tasks s...

06/19/2018 · Dynamic Multi-Level Multi-Task Learning for Sentence Simplification
Sentence simplification aims to improve readability and understandabilit...

11/27/2018 · Kernel-based Multi-Task Contextual Bandits in Cellular Network Configuration
Cellular network configuration plays a critical role in network performa...

10/26/2018 · Integrating Transformer and Paraphrase Rules for Sentence Simplification
Sentence simplification aims to reduce the complexity of a sentence whil...

09/27/2021 · Multi-Task and Multi-Corpora Training Strategies to Enhance Argumentative Sentence Linking Performance
Argumentative structure prediction aims to establish links between textu...
