Integrating Multiplicative Features into Supervised Distributional Methods for Lexical Entailment

04/24/2018
by Tu Vu, et al.

Supervised distributional methods are applied successfully to lexical entailment, but recent work has questioned whether these methods actually learn a relation between the two words. Specifically, Levy et al. (2015) claimed that linear classifiers learn only separate properties of each word. We suggest a cheap and easy way to boost the performance of these methods by integrating multiplicative features into commonly used representations. We provide an extensive evaluation with different classifiers and evaluation setups, and propose a suitable evaluation setup for the task, one that eliminates the biases present in previous setups.
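
As a rough illustration of what "multiplicative features" might look like in practice, the sketch below appends the element-wise product of the two word vectors to the standard concatenation representation used by supervised lexical-entailment classifiers. The variable names, the 300-dimensional toy vectors, and the exact combination scheme are assumptions for illustration, not the paper's precise setup.

```python
import numpy as np

def pair_features(u, v):
    """Build a feature vector for a candidate (hyponym, hypernym) word pair.

    Concatenates the two embeddings (a common supervised baseline) and
    appends their element-wise product as multiplicative features.
    u and v are assumed to be pre-trained embeddings of equal dimension.
    """
    return np.concatenate([u, v, u * v])

# Toy usage: random 300-d vectors stand in for real pre-trained embeddings.
rng = np.random.default_rng(0)
u = rng.normal(size=300)
v = rng.normal(size=300)
x = pair_features(u, v)  # 900-d input for a linear classifier
```

Because the product u * v is a nonlinear function of the pair, a linear classifier over these features can capture interactions between the two words that plain concatenation cannot.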


Related research

06/22/2015 | Distributional Sentence Entailment Using Density Matrices
Categorical compositional distributional model of Coecke et al. (2010) s...

07/13/2016 | A Vector Space for Distributional Semantics for Entailment
Distributional semantics creates vector-space representations that captu...

05/23/2018 | Scoring Lexical Entailment with a Supervised Directional Similarity Network
We present the Supervised Directional Similarity Network (SDSN), a novel...

05/18/2016 | Relations such as Hypernymy: Identifying and Exploiting Hearst Patterns in Distributional Vectors for Lexical Entailment
We consider the task of predicting lexical entailment using distribution...

12/02/2014 | Tiered Clustering to Improve Lexical Entailment
Many tasks in Natural Language Processing involve recognizing lexical en...

08/06/2016 | HyperLex: A Large-Scale Evaluation of Graded Lexical Entailment
We introduce HyperLex - a dataset and evaluation resource that quantifie...

10/02/2017 | Unsupervised Hypernym Detection by Distributional Inclusion Vector Embedding
Modeling hypernymy, such as poodle is-a dog, is an important generalizat...
