Inducing Syntactic Trees from BERT Representations

06/27/2019
by Rudolf Rosa, et al.

We use the English model of BERT and explore how deleting one word from a sentence changes the representations of the other words. Our hypothesis is that removing a reducible word (e.g., an adjective) affects the representations of the other words less than removing, e.g., the main verb, which makes the sentence ungrammatical and of "high surprise" for the language model. We estimate reducibilities of individual words and of longer continuous phrases (word n-grams), study their syntax-related properties, and then also use them to induce full dependency trees.
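As a rough illustration of the deletion-based measurement described above, the following minimal sketch compares each word's neighbours' BERT vectors before and after removing that word. It assumes the HuggingFace transformers library with bert-base-uncased, cosine distance over final-layer states, averaging over wordpieces, and a made-up example sentence; these are illustrative choices, not necessarily the paper's exact setup.

import torch
import torch.nn.functional as F
from transformers import BertModel, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def word_vectors(words):
    """One vector per word, averaged over its wordpieces (final BERT layer)."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (num_wordpieces, dim)
    ids = enc.word_ids(0)  # maps each wordpiece to its source word (None for [CLS]/[SEP])
    return torch.stack([
        hidden[[j for j, w in enumerate(ids) if w == i]].mean(dim=0)
        for i in range(len(words))
    ])

def removal_impact(words, i):
    """Average cosine distance between the remaining words' vectors computed
    with and without word i; a low impact suggests word i is reducible."""
    full = word_vectors(words)
    reduced = word_vectors(words[:i] + words[i + 1:])
    rest = torch.cat([full[:i], full[i + 1:]])
    return (1 - F.cosine_similarity(rest, reduced, dim=1)).mean().item()

sentence = "the government published a very detailed report yesterday".split()
for i, w in enumerate(sentence):
    print(f"{w:>10s}  {removal_impact(sentence, i):.4f}")

The paper goes further: it also scores continuous phrases (n-grams) rather than only single words and aggregates the resulting reducibilities to induce full dependency trees, which this sketch does not attempt.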

