Deep Neural Networks Evolve Human-like Attention Distribution during Reading Comprehension

07/13/2021
by   Jiajie Zou, et al.

Attention is a key mechanism for information selection in both biological brains and many state-of-the-art deep neural networks (DNNs). Here, we investigate whether humans and DNNs allocate attention in comparable ways when reading a text passage in order to subsequently answer a specific question. We analyze three transformer-based DNNs that reach human-level performance when trained on the reading comprehension task. We find that the DNN attention distribution quantitatively resembles the human attention distribution measured by fixation times. Human readers fixate longer on words that are more relevant to the question-answering task, demonstrating that attention is modulated by top-down reading goals on top of lower-level visual and text features of the stimulus. Further analyses reveal that the attention weights in DNNs are also influenced by both top-down reading goals and lower-level stimulus features, with the shallow layers more strongly influenced by lower-level text features and the deep layers attending more to task-relevant words. Additionally, the deep layers' attention to task-relevant words gradually emerges when pre-trained DNN models are fine-tuned on the reading comprehension task, coinciding with the improvement in task performance. These results demonstrate that DNNs can evolve human-like attention distributions through task optimization, suggesting that human attention during goal-directed reading comprehension is itself a consequence of task optimization.
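The core comparison in the abstract is between a per-word attention distribution taken from a DNN layer and human per-word fixation times, typically summarized by a rank correlation. The sketch below illustrates that comparison with a hand-rolled Spearman correlation; the attention weights and fixation times are made-up illustrative numbers, not data from the paper.

```python
# Minimal sketch: correlate a (hypothetical) per-word DNN attention
# distribution with (hypothetical) human fixation times via Spearman's rho.
# Spearman's rho is the Pearson correlation computed on the ranks.

def ranks(xs):
    # Return 1-based average ranks (ties share the mean rank).
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Pearson correlation of the rank vectors of x and y.
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical attention each word receives in one layer (normalized over
# the passage) and hypothetical fixation times (ms) for the same 6 words.
attention = [0.05, 0.30, 0.10, 0.25, 0.22, 0.08]
fixation_ms = [180, 310, 190, 290, 300, 150]

rho = spearman(attention, fixation_ms)
print(round(rho, 3))  # prints 0.886
```

In the paper's setting, the attention vector would come from a transformer layer (attention received per word, aggregated over heads and query positions) and the fixation vector from eye-tracking; repeating the correlation per layer is what exposes the shallow-vs-deep layer pattern described above.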

Related research:

- 09/30/2020 · Bridging Information-Seeking Human Gaze and Machine Reading Comprehension: In this work, we analyze how human gaze during reading comprehension is ...
- 08/27/2018 · Comparing Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension: We propose a machine reading comprehension model based on the compare-ag...
- 09/08/2018 · Generating Distractors for Reading Comprehension Questions from Real Examinations: We investigate the task of distractor generation for multiple choice rea...
- 11/06/2016 · Words or Characters? Fine-grained Gating for Reading Comprehension: Previous work combines word-level and character-level representations us...
- 10/13/2020 · Interpreting Attention Models with Human Visual Attention in Machine Reading Comprehension: While neural networks with attention mechanisms have achieved superior p...
- 05/05/2017 · Sequential Attention: A Context-Aware Alignment Function for Machine Reading: In this paper we propose a neural network model with a novel Sequential ...
- 08/26/2021 · Understanding Attention in Machine Reading Comprehension: Achieving human-level performance on some of Machine Reading Comprehensi...
