What do you learn from context? Probing for sentence structure in contextualized word representations

05/15/2019
by Ian Tenney, et al.

Contextualized representation models such as ELMo (Peters et al., 2018a) and BERT (Devlin et al., 2018) have recently achieved state-of-the-art results on a diverse array of downstream NLP tasks. Building on recent token-level probing work, we introduce a novel edge probing task design and construct a broad suite of sub-sentence tasks derived from the traditional structured NLP pipeline. We probe word-level contextual representations from four recent models and investigate how they encode sentence structure across a range of syntactic, semantic, local, and long-range phenomena. We find that existing models trained on language modeling and translation produce strong representations for syntactic phenomena, but only offer comparably small improvements on semantic tasks over a non-contextual baseline.
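The edge probing setup trains only a lightweight classifier on top of frozen contextual vectors, so any task performance must come from what the encoder already represents. Below is a minimal PyTorch sketch of this idea, assuming mean pooling over spans and hypothetical dimensions; the paper's actual probe uses learned self-attention pooling and projection layers, so this is a simplified stand-in rather than the authors' exact architecture:

```python
import torch
import torch.nn as nn

class EdgeProbe(nn.Module):
    """Span classifier over frozen contextual word representations.

    Hypothetical sketch: mean-pools the token vectors inside each
    span, concatenates the pooled spans, and applies a small MLP.
    """

    def __init__(self, hidden_dim: int, num_labels: int, num_spans: int = 1):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(hidden_dim * num_spans, 256),
            nn.ReLU(),
            nn.Linear(256, num_labels),
        )

    def pool(self, reps: torch.Tensor, span: tuple[int, int]) -> torch.Tensor:
        # Mean-pool the token vectors over [start, end) of one span.
        start, end = span
        return reps[start:end].mean(dim=0)

    def forward(self, reps: torch.Tensor, spans: list[tuple[int, int]]) -> torch.Tensor:
        # reps: [seq_len, hidden_dim] from a frozen encoder (e.g. ELMo or BERT);
        # only the probe's parameters receive gradients during training.
        pooled = torch.cat([self.pool(reps, s) for s in spans], dim=-1)
        return self.mlp(pooled)

# Toy usage: classify a two-span edge (e.g. a dependency arc) over
# made-up 768-dim vectors standing in for frozen encoder outputs.
reps = torch.randn(12, 768)                   # one 12-token sentence
probe = EdgeProbe(hidden_dim=768, num_labels=5, num_spans=2)
logits = probe(reps, spans=[(1, 3), (7, 8)])  # head span, dependent span
print(logits.shape)                           # torch.Size([5])
```

Single-span tasks (e.g. part-of-speech or constituent labeling) use one span per example, while relational tasks (e.g. dependency labeling or coreference) pass a pair, which is why `num_spans` is a parameter here.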

