Good, Better, Best: Choosing Word Embedding Context

11/19/2015
by James Cross, et al.

We propose two methods of learning vector representations of words and phrases that each combine sentence context with structural features extracted from dependency trees. Using several variations of a neural network classifier, we show that these combined methods lead to improved performance when used as input features for supervised term-matching.
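The abstract does not spell out the exact pair-extraction scheme, so as a rough illustration, here is a minimal sketch of how linear (sentence-window) contexts can be combined with dependency-arc contexts in the style of Levy and Goldberg (2014). The toy sentence, the hand-coded parse, and the helper names (`linear_pairs`, `dependency_pairs`) are assumptions for illustration, not the paper's implementation; in practice the parse would come from a dependency parser.

```python
# Sketch: (target, context) training pairs from both linear context and
# dependency arcs. The parse below is hand-coded for illustration; in a
# real pipeline it would be produced by a dependency parser.

sentence = ["the", "cat", "chased", "a", "small", "mouse"]
# (head index, dependency label) for each token; -1 marks the root.
parse = [(1, "det"), (2, "nsubj"), (-1, "root"),
         (5, "det"), (5, "amod"), (2, "dobj")]

def linear_pairs(tokens, window=2):
    """(target, context) pairs from a symmetric window, as in word2vec."""
    for i, w in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                yield (w, tokens[j])

def dependency_pairs(tokens, parse):
    """(target, labeled context) pairs from dependency arcs, following
    Levy & Goldberg (2014): each arc yields a context in both directions."""
    for i, (head, label) in enumerate(parse):
        if head < 0:
            continue
        yield (tokens[i], f"{tokens[head]}/{label}")      # child -> head
        yield (tokens[head], f"{tokens[i]}/{label}-inv")  # head -> child

# The combined pair set would then feed a skip-gram-style embedding trainer.
pairs = list(linear_pairs(sentence)) + list(dependency_pairs(sentence, parse))
for target, context in pairs:
    print(target, "->", context)
```

Dependency contexts pick out syntactically related words (e.g. `chased -> mouse/dobj`) that a small linear window can miss, which is one motivation for combining the two context types.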


Related research

11/09/2019
PoD: Positional Dependency-Based Word Embedding for Aspect Term Extraction
Dependency context-based word embedding jointly learns the representatio...

10/26/2021
Task-Specific Dependency-based Word Embedding Methods
Two task-specific dependency-based word embedding methods are proposed f...

06/27/2019
Inducing Syntactic Trees from BERT Representations
We use the English model of BERT and explore how a deletion of one word ...

05/26/2018
SJTU-NLP at SemEval-2018 Task 9: Neural Hypernym Discovery with Term Embeddings
This paper describes a hypernym discovery system for our participation i...

08/14/2018
Embedding Grammars
Classic grammars and regular expressions can be used for a variety of pu...

05/25/2016
Unsupervised Word and Dependency Path Embeddings for Aspect Term Extraction
In this paper, we develop a novel approach to aspect term extraction bas...

11/22/2019
Topical Phrase Extraction from Clinical Reports by Incorporating both Local and Global Context
Making sense of words often requires to simultaneously examine the surro...
