Examining the rhetorical capacities of neural language models

10/01/2020
by Zining Zhu, et al.

Recently, neural language models (LMs) have demonstrated impressive abilities in generating high-quality discourse. While many recent papers have analyzed the syntactic aspects encoded in LMs, there has been no analysis to date of their inter-sentential, rhetorical knowledge. In this paper, we propose a method that quantitatively evaluates the rhetorical capacities of neural LMs. We examine the capacities of neural LMs to understand the rhetoric of discourse by evaluating their abilities to encode a set of linguistic features derived from Rhetorical Structure Theory (RST). Our experiments show that BERT-based LMs outperform other Transformer LMs, revealing richer discourse knowledge in their intermediate-layer representations. In addition, GPT-2 and XLNet appear to encode less rhetorical knowledge, and we suggest an explanation drawing from linguistic philosophy. Our method offers an avenue toward quantifying the rhetorical capacities of neural LMs.
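The evaluation described above follows the general pattern of probing: train a simple classifier on an LM's hidden states to predict a linguistic feature, and take the classifier's held-out accuracy as a measure of how well that feature is encoded. The sketch below illustrates this setup with synthetic stand-in embeddings and a hypothetical binary RST-style label (e.g. nucleus vs. satellite role); a real probe would extract per-sentence hidden states from a model such as BERT and use features derived from RST parses.

```python
# Minimal probing sketch (assumptions: synthetic embeddings stand in for
# intermediate-layer LM hidden states; the binary label stands in for an
# RST-derived rhetorical feature). Not the paper's exact method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_sentences, hidden_dim = 400, 64
# Stand-in for layer-l hidden states of an LM, one vector per sentence.
embeddings = rng.normal(size=(n_sentences, hidden_dim))
# Stand-in label correlated with one embedding dimension, so the probe
# has a recoverable signal to find.
labels = (embeddings[:, 0] + 0.5 * rng.normal(size=n_sentences) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, test_size=0.25, random_state=0
)

# A linear probe: high accuracy suggests the feature is linearly
# decodable from the representation.
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = probe.score(X_test, y_test)
print(f"probe accuracy: {accuracy:.2f}")
```

Comparing such probe accuracies across layers and across models (BERT vs. GPT-2 vs. XLNet) is what allows the kind of ranking the abstract reports.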


