Just-DREAM-about-it: Figurative Language Understanding with DREAM-FLUTE

10/28/2022
by Yuling Gu, et al.

Figurative language (e.g., "he flew like the wind") is challenging to understand, as it is hard to tell what implicit information is being conveyed from the surface form alone. We hypothesize that to perform this task well, the reader needs to mentally elaborate the scene being described to identify a sensible meaning of the language. We present DREAM-FLUTE, a figurative language understanding system that does this, first forming a "mental model" of the situations described in a premise and hypothesis before making an entailment/contradiction decision and generating an explanation. DREAM-FLUTE uses an existing scene elaboration model, DREAM, for constructing its "mental model." In the FigLang2022 Shared Task evaluation, DREAM-FLUTE achieved (joint) first place (Acc@60=63.3%), and can perform even better with ensemble techniques, demonstrating the effectiveness of this approach. More generally, this work suggests that adding a reflective component to pretrained language models can improve their performance beyond standard fine-tuning (a 3.3% improvement in Acc@60).
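The pipeline described above can be sketched as follows. This is a minimal illustrative outline, not the authors' implementation: the function names (`elaborate_scene`, `dream_flute`), the scene dimensions, and the stubbed model calls are all hypothetical placeholders standing in for the DREAM elaboration model and the fine-tuned entailment/explanation model.

```python
# Hypothetical sketch of the DREAM-FLUTE pipeline: elaborate a "mental
# model" of each sentence, then decide entailment vs. contradiction and
# produce an explanation. All names and outputs here are illustrative.

def elaborate_scene(text):
    """Stand-in for the DREAM scene-elaboration model.

    In the real system, a sequence-to-sequence model generates these
    elaborations; here we return toy placeholders.
    """
    return {
        "emotion": "unspecified",
        "motivation": "unspecified",
        "likely_consequence": f"inferred from: {text}",
    }

def dream_flute(premise, hypothesis):
    """Form mental models of premise and hypothesis, then classify and explain.

    The real system feeds the original texts plus their elaborations into a
    fine-tuned language model; this stub returns a fixed label for illustration.
    """
    premise_scene = elaborate_scene(premise)
    hypothesis_scene = elaborate_scene(hypothesis)
    label = "entailment"  # placeholder for the model's decision
    explanation = "The scene evoked by the premise supports the hypothesis."
    return label, explanation

label, explanation = dream_flute("He flew like the wind.", "He ran very fast.")
print(label, "-", explanation)
```

The key design point the sketch reflects is that scene elaboration happens *before* classification, so the downstream model can condition on the implicit situational details rather than the figurative surface form alone.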

