Automatic Metaphor Interpretation Using Word Embeddings

10/06/2020
by Kfir Bar et al.

We suggest a model for metaphor interpretation that uses word embeddings trained over a relatively large corpus. Our system handles nominal metaphors such as "time is money" and generates a ranked list of potential interpretations for a given metaphor. Candidate meanings are drawn from collocations of the topic ("time") and vehicle ("money") components, automatically extracted from a dependency-parsed corpus. We also explore adding candidates derived from word association norms (common human responses to cues). Our ranking procedure considers the similarity between candidate interpretations and the metaphor components, measured in a semantic vector space. Lastly, a clustering algorithm removes semantically related duplicates, thereby allowing other candidate interpretations to attain a higher rank. We evaluate the system on a set of annotated metaphors.

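The abstract outlines a two-step pipeline: score candidate interpretations by their embedding similarity to both the topic and the vehicle, then prune semantically related duplicates so distinct interpretations can rise in the ranking. The sketch below is a minimal illustration of that idea, not the authors' actual implementation: the toy vectors, candidate list, summed-cosine scoring, and the 0.8 deduplication threshold are all illustrative assumptions; a real system would use embeddings trained over a large corpus and candidates extracted from a dependency-parsed corpus and word association norms.

```python
# Minimal sketch of the ranking-and-deduplication idea from the abstract.
# Toy embeddings and thresholds are illustrative assumptions only.
import numpy as np

# Assumed toy word vectors; in practice these come from a trained embedding model.
EMB = {
    "time":     np.array([0.9, 0.1, 0.3]),
    "money":    np.array([0.2, 0.8, 0.4]),
    "valuable": np.array([0.5, 0.7, 0.4]),
    "precious": np.array([0.5, 0.6, 0.5]),
    "scarce":   np.array([0.4, 0.5, 0.2]),
    "wasted":   np.array([0.6, 0.3, 0.1]),
}

def cos(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def rank_interpretations(topic, vehicle, candidates):
    """Score each candidate by its similarity to both metaphor components
    (here: a simple sum of cosines) and sort from best to worst."""
    scored = [(c, cos(EMB[c], EMB[topic]) + cos(EMB[c], EMB[vehicle]))
              for c in candidates]
    return sorted(scored, key=lambda x: x[1], reverse=True)

def dedupe(ranked, threshold=0.8):
    """Greedy clustering stand-in: drop a candidate if it is too similar to one
    already kept, letting semantically distinct candidates attain higher rank."""
    kept = []
    for cand, score in ranked:
        if all(cos(EMB[cand], EMB[k]) < threshold for k, _ in kept):
            kept.append((cand, score))
    return kept

if __name__ == "__main__":
    ranked = rank_interpretations("time", "money",
                                  ["valuable", "precious", "scarce", "wasted"])
    print(dedupe(ranked))
```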
