BERT Rediscovers the Classical NLP Pipeline

05/15/2019
by Ian Tenney, et al.

Pre-trained text encoders have rapidly advanced the state of the art on many NLP tasks. We focus on one such model, BERT, and aim to quantify where linguistic information is captured within the network. We find that the model represents the steps of the traditional NLP pipeline in an interpretable and localizable way, and that the regions responsible for each step appear in the expected sequence: POS tagging, parsing, NER, semantic roles, then coreference. Qualitative analysis reveals that the model can and often does adjust this pipeline dynamically, revising lower-level decisions on the basis of disambiguating information from higher-level representations.

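The paper localizes linguistic knowledge by probing BERT's layers with auxiliary classifiers (edge probing with learned scalar mixing weights over layers). As a rough, simplified illustration of layer-wise probing, the sketch below extracts every layer's hidden states with the Hugging Face transformers library and fits a plain linear probe per layer on a tiny part-of-speech task; the sentences, tags, and per-layer training accuracies are invented for demonstration only and are not the paper's experimental setup.

# Minimal layer-wise probing sketch (not the paper's exact edge-probing method).
# Assumes `transformers`, `torch`, and `scikit-learn` are installed; the toy
# POS-tagged sentences below are hypothetical, for illustration only.
import torch
from transformers import BertTokenizerFast, BertModel
from sklearn.linear_model import LogisticRegression

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

# Toy word-level POS data (hypothetical labels for demonstration only).
sentences = [
    (["The", "dog", "barks"], ["DET", "NOUN", "VERB"]),
    (["A", "cat", "sleeps"], ["DET", "NOUN", "VERB"]),
    (["The", "bird", "sings"], ["DET", "NOUN", "VERB"]),
]
label_ids = {"DET": 0, "NOUN": 1, "VERB": 2}

def layer_features(words):
    """Return one per-word feature matrix for every layer (first sub-token of each word)."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).hidden_states  # tuple: embedding output + 12 encoder layers
    word_ids = enc.word_ids()
    first_idx = [word_ids.index(i) for i in range(len(words))]  # first sub-token per word
    return [h[0, first_idx].numpy() for h in hidden]

# Collect features separately for each layer, plus the shared label vector.
n_layers = model.config.num_hidden_layers + 1  # embeddings + 12 layers
X = [[] for _ in range(n_layers)]
y = []
for words, tags in sentences:
    feats = layer_features(words)
    for layer in range(n_layers):
        X[layer].extend(feats[layer])
    y.extend(label_ids[t] for t in tags)

# Fit a simple linear probe per layer; accuracy hints at where POS information
# becomes linearly accessible in the network.
for layer in range(n_layers):
    probe = LogisticRegression(max_iter=1000).fit(X[layer], y)
    print(f"layer {layer:2d}  train acc = {probe.score(X[layer], y):.2f}")

In the paper itself, probe performance is aggregated across layers with scalar mixing weights and expected-layer statistics over much larger probing suites; this sketch only conveys the general idea of asking each layer how much task-relevant information it exposes.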

Related research

04/14/2020 · What's so special about BERT's layers? A closer look at the NLP pipeline in monolingual and multilingual models
Experiments with transfer learning on pre-trained language models such a...

05/31/2021 · How transfer learning impacts linguistic knowledge in deep NLP models?
Transfer learning from pre-trained neural language models towards downst...

07/14/2021 · Large-Scale News Classification using BERT Language Model: Spark NLP Approach
The rise of big data analytics on top of NLP increases the computational...

08/01/2017 · Improving Part-of-Speech Tagging for NLP Pipelines
This paper outlines the results of sentence level linguistics based rule...

11/09/2020 · VisBERT: Hidden-State Visualizations for Transformers
Explainability and interpretability are two important concepts, the abse...

05/26/2020 · Comparing BERT against traditional machine learning text classification
The BERT model has arisen as a popular state-of-the-art machine learning...

12/03/2021 · Augmenting Customer Support with an NLP-based Receptionist
In this paper, we show how a Portuguese BERT model can be combined with ...
