IRB-NLP at SemEval-2022 Task 1: Exploring the Relationship Between Words and Their Semantic Representations

05/13/2022
by Damir Korenčić, et al.

What is the relation between a word and its description, or between a word and its embedding? Both descriptions and embeddings are semantic representations of words. But what information about the original word remains in these representations? More importantly, which information about a word do the two representations share? Definition Modeling and Reverse Dictionary are two opposite learning tasks that address these questions. The goal of the Definition Modeling task is to investigate the power of the information lying inside a word embedding to express the meaning of the word in a humanly understandable way, as a dictionary definition. Conversely, the Reverse Dictionary task explores the ability to predict a word's embedding directly from its definition. In this paper we tackle these two tasks and thereby explore the relationship between words and their semantic representations. We present our findings based on the descriptive, exploratory, and predictive data analysis conducted on the CODWOE dataset. We give a detailed overview of the systems we designed for the Definition Modeling and Reverse Dictionary tasks, which achieved top scores in several subtasks of the SemEval-2022 CODWOE challenge. We hope that our experimental results concerning the predictive models, and the data analyses we provide, will prove useful in future explorations of word representations and their relationships.
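To make the two task directions concrete, here is a minimal toy sketch of the Reverse Dictionary direction (definition in, embedding out). This is not the authors' system: the vocabulary, the random stand-in embeddings, and the bag-of-words averaging encoder are all illustrative assumptions; real systems train a neural encoder on the CODWOE glosses and embeddings.

```python
import numpy as np

# Toy word embeddings. In the CODWOE setting these come from the dataset;
# here they are random stand-ins, just to show the interfaces.
rng = np.random.default_rng(0)
vocab = ["cat", "dog", "run"]
emb = {w: rng.normal(size=8) for w in vocab}

def encode_definition(definition, emb, dim=8):
    """Reverse Dictionary direction: map a definition (gloss) to a vector
    in the same space as the word embeddings. Here we simply average the
    embeddings of the gloss words that are in the vocabulary."""
    vecs = [emb[w] for w in definition.lower().split() if w in emb]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

def nearest_word(vec, emb):
    """Evaluate a predicted embedding by cosine similarity against the
    vocabulary, returning the closest word."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(emb, key=lambda w: cos(vec, emb[w]))

# A gloss that mentions vocabulary words should land near their embeddings.
pred = encode_definition("an animal that can run like a dog", emb)
print(nearest_word(pred, emb))
```

The opposite direction, Definition Modeling, would invert `encode_definition`: given `emb["dog"]`, generate a gloss, typically with a conditioned sequence decoder rather than any closed-form inverse.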

Related research

12/01/2016 · Definition Modeling: Learning to define word embeddings in natural language
  Distributed representations of words have been shown to capture lexical ...

05/09/2022 · A Unified Model for Reverse Dictionary and Definition Modelling
  We train a dual-way neural dictionary to guess words from definitions (r...

09/05/2017 · Using k-way Co-occurrences for Learning Word Embeddings
  Co-occurrences between two words provide useful insights into the semant...

10/09/2019 · Word Embedding Visualization Via Dictionary Learning
  Co-occurrence statistics based word embedding techniques have proved to ...

05/20/2017 · Mixed Membership Word Embeddings for Computational Social Science
  Word embeddings improve the performance of NLP systems by revealing the ...

09/10/2019 · Definition Frames: Using Definitions for Hybrid Concept Representations
  Concept representations is a particularly active area in NLP. Although r...

03/24/2018 · Equation Embeddings
  We present an unsupervised approach for discovering semantic representat...
