M-SENSE: Modeling Narrative Structure in Short Personal Narratives Using Protagonist's Mental Representations

02/18/2023
by Prashanth Vijayaraghavan, et al.

Narrative is a ubiquitous component of human communication. Understanding its structure plays a critical role in a wide variety of applications, ranging from simple comparative analyses to enhanced narrative retrieval, comprehension, or reasoning capabilities. Prior research in narratology has highlighted the importance of studying the links between cognitive and linguistic aspects of narratives for effective comprehension. This interdependence relates to the textual semantics and the mental language in narratives, which refers to characters' motivations, feelings or emotions, and beliefs. However, this interdependence has hardly been explored for modeling narratives. In this work, we propose the task of automatically detecting prominent elements of the narrative structure by analyzing the role of characters' inferred mental states along with linguistic information at the syntactic and semantic levels. We introduce STORIES, a dataset of short personal narratives containing manual annotations of key elements of narrative structure, specifically climax and resolution. To this end, we implement a computational model that leverages the protagonist's mental state information, obtained from a pre-trained model trained on social commonsense knowledge, and integrates these representations with contextual semantic embeddings using a multi-feature fusion approach. Evaluating against prior zero-shot and supervised baselines, we find that our model achieves significant improvements on the task of identifying climax and resolution.
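The fusion described above can be pictured as combining two per-sentence feature streams: contextual sentence embeddings and mental-state embeddings inferred for the protagonist (e.g., from a COMET-style commonsense model). Below is a minimal PyTorch sketch of such a gated multi-feature fusion classifier; the module name, dimensions, and the gating scheme are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal sketch (not the authors' released code): fuse contextual sentence
# embeddings with protagonist mental-state embeddings via a learned gate, then
# classify each sentence as climax, resolution, or neither.
import torch
import torch.nn as nn


class MentalStateFusionClassifier(nn.Module):
    def __init__(self, text_dim=768, mental_dim=768, hidden_dim=256, num_labels=3):
        super().__init__()
        # Project both feature streams into a shared space.
        self.text_proj = nn.Linear(text_dim, hidden_dim)
        self.mental_proj = nn.Linear(mental_dim, hidden_dim)
        # Gate decides, per dimension, how much mental-state signal to mix in.
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, text_emb, mental_emb):
        # text_emb, mental_emb: (batch, num_sentences, dim)
        t = torch.tanh(self.text_proj(text_emb))
        m = torch.tanh(self.mental_proj(mental_emb))
        g = torch.sigmoid(self.gate(torch.cat([t, m], dim=-1)))
        fused = g * t + (1 - g) * m
        return self.classifier(fused)  # per-sentence logits: climax / resolution / other


# Usage with random tensors standing in for real sentence and mental-state embeddings.
model = MentalStateFusionClassifier()
text_emb = torch.randn(2, 10, 768)    # e.g., from a contextual encoder such as BERT
mental_emb = torch.randn(2, 10, 768)  # e.g., aggregated commonsense inferences per sentence
logits = model(text_emb, mental_emb)  # shape: (2, 10, 3)
```

The sigmoid gate is one common fusion choice: it lets the classifier down-weight mental-state features on sentences where they are uninformative, rather than always concatenating both streams with equal weight.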


