Unsupervised Word and Dependency Path Embeddings for Aspect Term Extraction

05/25/2016
by Yichun Yin, et al.

In this paper, we develop a novel approach to aspect term extraction based on unsupervised learning of distributed representations of words and dependency paths. The basic idea is to connect two words (w1 and w2) with the dependency path (r) between them in the embedding space. Specifically, our method optimizes the objective w1 + r = w2 in the low-dimensional space, where a multi-hop dependency path is treated as a sequence of grammatical relations and modeled by a recurrent neural network. We then design embedding features that capture both linear context and dependency context information for conditional random field (CRF) based aspect term extraction. Experimental results on the SemEval datasets show that (1) with only embedding features, we can achieve state-of-the-art results; (2) our embedding method, which incorporates the syntactic information among words, yields better performance in aspect term extraction than other representative methods.
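The objective described above can be illustrated with a small sketch. The code below is a toy illustration, not the paper's implementation: the vocabularies, dimensions, and parameter values are hypothetical, and the recurrent unit is a minimal vanilla RNN standing in for the network that composes a multi-hop dependency path into a single vector r, which is then scored against the translation objective w1 + r = w2.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # toy embedding dimension (illustrative only)

# Hypothetical toy vocabularies of word and grammatical-relation embeddings
word_emb = {w: rng.normal(size=dim) for w in ["food", "delicious", "was"]}
rel_emb = {r: rng.normal(size=dim) for r in ["nsubj", "cop", "amod"]}

# Parameters of a simple recurrent unit that composes a path of relations
W_h = rng.normal(scale=0.1, size=(dim, dim))
W_x = rng.normal(scale=0.1, size=(dim, dim))

def compose_path(relations):
    """Run a vanilla RNN over the sequence of grammatical relations,
    producing one vector r for the whole (possibly multi-hop) path."""
    h = np.zeros(dim)
    for rel in relations:
        h = np.tanh(W_h @ h + W_x @ rel_emb[rel])
    return h

def path_distance(w1, relations, w2):
    """Distance form of the objective w1 + r = w2 (lower = better fit).
    Training would minimize this for observed (w1, path, w2) triples."""
    r = compose_path(relations)
    return np.linalg.norm(word_emb[w1] + r - word_emb[w2])

# Score a two-hop path between two words (relations chosen arbitrarily)
score = path_distance("food", ["nsubj", "cop"], "delicious")
print(score >= 0.0)
```

In training, such distances would be pushed down for word pairs actually connected by the given dependency path and up for corrupted (negative) pairs; the learned word vectors then serve as features for the CRF tagger.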
