Neural Metaphor Detection in Context

08/29/2018
by Ge Gao, et al.

We present end-to-end neural models for detecting metaphorical word use in context. We show that relatively standard BiLSTM models which operate on complete sentences work well in this setting, in comparison to previous work that used more restricted forms of linguistic context. These models establish a new state-of-the-art on existing verb metaphor detection benchmarks, and show strong performance on jointly predicting the metaphoricity of all words in a running text.
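
The abstract describes a standard BiLSTM sequence labeler that reads a complete sentence and predicts metaphoricity per word. A minimal sketch of such a model is given below; this is not the authors' released implementation, and the class name, hyperparameters, and choice of PyTorch are assumptions for illustration only.

    # Minimal sketch (assumed PyTorch, not the authors' code) of a BiLSTM
    # sequence labeler that tags every word in a sentence as metaphorical
    # or literal.
    import torch
    import torch.nn as nn


    class BiLSTMMetaphorTagger(nn.Module):
        def __init__(self, vocab_size, emb_dim=300, hidden_dim=256, num_labels=2):
            super().__init__()
            # Word embeddings; in practice these would be initialized from
            # pretrained vectors rather than learned from scratch.
            self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            # Bidirectional LSTM over the complete sentence.
            self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                                  bidirectional=True)
            # Per-token classifier: metaphorical vs. literal.
            self.classifier = nn.Linear(2 * hidden_dim, num_labels)

        def forward(self, token_ids):
            # token_ids: (batch, seq_len) integer word indices
            embedded = self.embedding(token_ids)   # (batch, seq_len, emb_dim)
            hidden, _ = self.bilstm(embedded)      # (batch, seq_len, 2 * hidden_dim)
            return self.classifier(hidden)         # (batch, seq_len, num_labels)


    if __name__ == "__main__":
        model = BiLSTMMetaphorTagger(vocab_size=10000)
        dummy_batch = torch.randint(1, 10000, (4, 12))  # 4 sentences, 12 tokens each
        logits = model(dummy_batch)
        print(logits.shape)  # torch.Size([4, 12, 2])

Training such a tagger with a per-token cross-entropy loss covers both settings mentioned in the abstract: scoring a single target verb, or jointly predicting labels for all words in a running text.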
