What does BERT learn about prosody?

04/25/2023
by Sofoklis Kakouros, et al.

Language models have become nearly ubiquitous in natural language processing, achieving state-of-the-art results in many tasks, including prosody-related ones. Because model training does not define predetermined linguistic targets but instead aims at learning generalized representations of language, analyzing and interpreting the representations that models implicitly capture is important for bridging the gap between interpretability and model performance. Several studies have explored the linguistic information that models capture, providing some insight into their representational capacity. However, existing studies have not examined whether prosody is part of the structural information about language that models learn. In this work, we perform a series of experiments on BERT, probing the representations captured at its different layers. Our results show that information about prosodic prominence spans many layers but is concentrated mostly in the middle layers, suggesting that BERT relies primarily on syntactic and semantic information.
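The layer-wise probing described above can be sketched as training one lightweight classifier per layer on the model's hidden states and comparing accuracies. A minimal illustration, assuming per-token binary prominence labels are available; random features stand in for BERT hidden states so the sketch runs without model downloads (in a real experiment they would come from a BERT forward pass with `output_hidden_states=True`):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_tokens, hidden, n_layers = 2000, 768, 12

# Stand-in for BERT hidden states: one (n_tokens, hidden) matrix per layer.
# Replace with real per-token BERT activations for an actual probing study.
layer_states = [rng.normal(size=(n_tokens, hidden)) for _ in range(n_layers)]
labels = rng.integers(0, 2, size=n_tokens)  # hypothetical prominence labels

accs = []
for states in layer_states:
    X_tr, X_te, y_tr, y_te = train_test_split(states, labels, random_state=0)
    probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)  # linear probe
    accs.append(probe.score(X_te, y_te))

best_layer = int(np.argmax(accs))
print(f"best probing layer: {best_layer}, accuracy: {accs[best_layer]:.3f}")
```

With real activations, a peak in per-layer probe accuracy around the middle layers would mirror the paper's finding that prosodic prominence information concentrates there.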

