Use Generalized Representations, But Do Not Forget Surface Features

02/24/2017
by Nafise Sadat Moosavi, et al.

Only a year ago, all state-of-the-art coreference resolvers relied on an extensive set of surface features. Recently, there has been a paradigm shift toward word embeddings and deep neural networks, in which the use of surface features is very limited. In this paper, we show that a simple SVM model with surface features outperforms more complex neural models at detecting anaphoric mentions. Our analysis suggests that generalized representations and surface features have different strengths that should both be taken into account for improving coreference resolution.
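To make the contrast concrete, below is a minimal sketch of such a surface-feature anaphoricity classifier, using scikit-learn's LinearSVC. It is an illustration only: the extract_features helper, the feature names, and the toy mentions are hypothetical and do not reproduce the feature set or data used in the paper.

    # Hypothetical sketch: an SVM over surface features for anaphoricity
    # detection. Feature names and data are illustrative, not the paper's.
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    def extract_features(mention):
        # Surface features read directly off the mention string; no
        # embeddings or other generalized representations are involved.
        tokens = mention["text"].lower().split()
        return {
            "mention_string": " ".join(tokens),   # full surface form
            "first_word": tokens[0],
            "head_word": mention["head"].lower(), # syntactic head
            "length": len(tokens),                # number of tokens
            "is_pronoun": mention["pos"] == "PRP",
        }

    # Toy mentions labeled anaphoric (1) vs. non-anaphoric (0).
    train_mentions = [
        {"text": "the company", "head": "company", "pos": "NN"},
        {"text": "a new report", "head": "report", "pos": "NN"},
        {"text": "it", "head": "it", "pos": "PRP"},
    ]
    train_labels = [1, 0, 1]

    # DictVectorizer one-hot encodes string-valued features and passes
    # numeric ones through; LinearSVC is the linear-kernel SVM classifier.
    model = make_pipeline(DictVectorizer(), LinearSVC())
    model.fit([extract_features(m) for m in train_mentions], train_labels)

    test_mention = {"text": "the report", "head": "report", "pos": "NN"}
    print(model.predict([extract_features(test_mention)]))

The design point is that every feature is a directly observable property of the mention's surface form, which is exactly the kind of signal the abstract argues should not be discarded when moving to neural models.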


