SemEval-2022 Task 1: CODWOE – Comparing Dictionaries and Word Embeddings

05/27/2022
by Timothée Mickus, et al.

Word embeddings have advanced the state of the art in NLP across numerous tasks. Understanding the contents of dense neural representations is of utmost interest to the computational semantics community. We propose to focus on relating these opaque word vectors to human-readable definitions, as found in dictionaries. This problem naturally divides into two subtasks: converting definitions into embeddings (the reverse dictionary track) and converting embeddings into definitions (the definition modeling track). The task was conducted in a multilingual setting, using comparable sets of embeddings trained homogeneously.
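To make the two subtasks concrete, the sketch below shows one plausible way to model each direction in PyTorch. It is an illustration under assumptions, not the shared task's official baseline: the LSTM encoder/decoder, the 256-dimensional vectors, the 10,000-token gloss vocabulary, and the random toy data are all hypothetical.

```python
# Minimal sketch of the two CODWOE-style subtasks; all sizes and the toy
# data below are hypothetical placeholders, not the shared task's settings.
import torch
import torch.nn as nn

EMB_DIM = 256   # dimensionality of the target word vectors (assumed)
VOCAB = 10_000  # gloss vocabulary size (assumed)


class ReverseDictionary(nn.Module):
    """Definitions -> embeddings: encode a gloss, regress to a word vector."""

    def __init__(self):
        super().__init__()
        self.tok_emb = nn.Embedding(VOCAB, EMB_DIM, padding_idx=0)
        self.encoder = nn.LSTM(EMB_DIM, EMB_DIM, batch_first=True)
        self.proj = nn.Linear(EMB_DIM, EMB_DIM)

    def forward(self, gloss_ids):                  # (batch, seq_len)
        states, _ = self.encoder(self.tok_emb(gloss_ids))
        return self.proj(states.mean(dim=1))       # (batch, EMB_DIM)


class DefinitionModeler(nn.Module):
    """Embeddings -> definitions: decode gloss tokens from a word vector."""

    def __init__(self):
        super().__init__()
        self.tok_emb = nn.Embedding(VOCAB, EMB_DIM, padding_idx=0)
        self.decoder = nn.LSTM(EMB_DIM, EMB_DIM, batch_first=True)
        self.out = nn.Linear(EMB_DIM, VOCAB)

    def forward(self, word_vec, gloss_ids):
        # Condition the decoder on the word vector via its initial state.
        h0 = word_vec.unsqueeze(0)                  # (1, batch, EMB_DIM)
        states, _ = self.decoder(self.tok_emb(gloss_ids),
                                 (h0, torch.zeros_like(h0)))
        return self.out(states)                     # (batch, seq_len, VOCAB)


# Toy round trip on random data: definitions -> vectors -> definitions.
gloss = torch.randint(1, VOCAB, (4, 12))      # 4 glosses of 12 token ids
vectors = ReverseDictionary()(gloss)          # shape (4, 256)
logits = DefinitionModeler()(vectors, gloss)  # shape (4, 12, 10000)
print(vectors.shape, logits.shape)
```

In a setup like this, the reverse dictionary side would typically be trained with a vector reconstruction loss (e.g. MSE against the target embedding) and the definition modeling side with token-level cross-entropy over the gloss.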

Related research:

- Learning Word Sense Embeddings from Word Sense Definitions (06/15/2016)
- Complex networks based word embeddings (10/03/2019)
- Dictionary-based Debiasing of Pre-trained Word Embeddings (01/23/2021)
- Context-Attentive Embeddings for Improved Sentence Representations (04/21/2018)
- Multi-Relational Hyperbolic Word Embeddings from Natural Language Definitions (05/12/2023)
- Analogies Explained: Towards Understanding Word Embeddings (01/28/2019)
- Automated Generation of Multilingual Clusters for the Evaluation of Distributed Representations (11/04/2016)
