Sparse associative memory based on contextual code learning for disambiguating word senses

11/14/2019
by Max Raphael Sobroza, et al.

In the recent literature, contextual pretrained Language Models (LMs) have demonstrated their potential to generalize knowledge across several Natural Language Processing (NLP) tasks, including supervised Word Sense Disambiguation (WSD), a challenging problem in the field of Natural Language Understanding (NLU). However, the word representations produced by these models remain very dense, costly in memory footprint, and minimally interpretable. To address these issues, we propose a new supervised, biologically inspired technique that transfers large pretrained language model representations into a compressed representation for WSD. The resulting representation increases the overall interpretability of the framework and decreases the memory footprint, while improving performance.
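The abstract describes mapping dense contextual embeddings into compressed sparse codes stored in an associative memory keyed by word sense. The paper's actual learning procedure is not given here, so the sketch below is only a hedged illustration of the general idea: a top-k winner-take-all binarization (a common biologically inspired sparsification) and a prototype-per-sense memory queried by code overlap. All names, the dimensionality, and the sparsity level are assumptions, not the authors' method.

```python
import numpy as np

def sparse_code(dense_vec, k=32):
    """Top-k winner-take-all binarization: keep the k largest
    activations as 1s, zero the rest. Hypothetical stand-in for
    the paper's contextual code learning step."""
    code = np.zeros_like(dense_vec, dtype=np.int8)
    code[np.argsort(dense_vec)[-k:]] = 1
    return code

class SparseSenseMemory:
    """Toy associative memory: one accumulated sparse prototype per sense."""
    def __init__(self):
        self.protos = {}

    def store(self, sense, code):
        # Superimpose codes observed for the same sense.
        self.protos[sense] = self.protos.get(sense, 0) + code

    def query(self, code):
        # Return the sense whose prototype overlaps most with the query.
        return max(self.protos, key=lambda s: int(self.protos[s] @ code))

rng = np.random.default_rng(0)
mem = SparseSenseMemory()
bank_river = rng.normal(size=768)    # stand-in contextual embedding
bank_finance = rng.normal(size=768)  # stand-in contextual embedding
mem.store("bank_river", sparse_code(bank_river))
mem.store("bank_finance", sparse_code(bank_finance))

# A lightly perturbed embedding should still retrieve its sense,
# since most of its top-k active units are unchanged.
noisy = bank_river + 0.1 * rng.normal(size=768)
print(mem.query(sparse_code(noisy)))
```

Such binary codes need only k active indices per vector instead of 768 floats, which is the kind of memory saving the abstract alludes to; the overlap query also makes each retrieval attributable to specific shared units, a simple form of interpretability.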


