Enhanced Word Representations for Bridging Anaphora Resolution

03/13/2018
by   Yufang Hou, et al.

Most current models of word representations (e.g., GloVe) have successfully captured fine-grained semantics. However, the semantic similarity exhibited by these word embeddings is not suitable for resolving bridging anaphora, which requires knowledge of associative similarity (i.e., relatedness) rather than the semantic similarity that holds between synonyms or hypernyms. We create word embeddings (embeddings_PP) to capture such relatedness by exploring the syntactic structure of noun phrases. We demonstrate that using embeddings_PP alone achieves around 30% accuracy for bridging anaphora resolution on the ISNotes corpus. Furthermore, we achieve a substantial gain over the state-of-the-art system (Hou et al., 2013) for bridging antecedent selection.
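The core idea of embeddings_PP is to mine word pairs from prepositional noun-phrase structures (e.g., "the residents of the city" relates "residents" to "city") and use them as (word, context) training instances for a word2vec-style model. As a rough illustration only, the following hypothetical sketch extracts such pairs with a regex heuristic; the paper itself relies on full syntactic parses, and the function name, preposition list, and stopword handling here are all simplifying assumptions.

```python
import re

# Minimal stopword list for this illustration; a real system would use a parser.
STOPWORDS = {"the", "a", "an", "this", "that", "its", "his", "her"}

def np_pairs(noun_phrase, preps=("of", "in", "at", "on", "for")):
    """Extract a (head, modifier-head) pair from a prepositional noun phrase.

    Heuristic sketch: split the phrase at the first preposition and
    approximate each side's head as its last content word.
    """
    pattern = r"^(.*?)\b(" + "|".join(preps) + r")\b(.*)$"
    m = re.match(pattern, noun_phrase.lower())
    if not m:
        return []
    left = [w for w in m.group(1).split() if w not in STOPWORDS]
    right = [w for w in m.group(3).split() if w not in STOPWORDS]
    if not left or not right:
        return []
    return [(left[-1], right[-1])]

pairs = np_pairs("the residents of the city")
# pairs == [("residents", "city")]
```

Pairs harvested this way over a large corpus can then serve as training data for embeddings that encode associative relatedness (residents–city) rather than synonymy.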


