Meta-Embeddings for Natural Language Inference and Semantic Similarity tasks

12/01/2020
by Shree Charran R, et al.

Word representations form the core component of almost all advanced Natural Language Processing (NLP) applications, such as text mining, question answering, and text summarization. Over the last two decades, immense research has been conducted in pursuit of a single model that solves all major NLP tasks. The central problem at present is that there is a plethora of model choices for different NLP tasks, so for NLP practitioners, choosing the right model becomes a challenge in itself. Combining multiple pre-trained word embeddings into a meta-embedding has therefore become a viable approach to tackling NLP tasks. Meta-embedding learning is the process of producing a single word embedding from a given set of pre-trained input word embeddings. In this paper, we propose to use meta-embeddings derived from a few State-of-the-Art (SOTA) models to efficiently tackle mainstream NLP tasks such as classification, semantic relatedness, and text similarity. We compare both ensemble and dynamic variants to identify an efficient approach. The results show that even the best State-of-the-Art models can be improved upon, demonstrating that meta-embeddings can serve several NLP tasks by harnessing the power of several individual representations.
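
For concreteness, here is a minimal sketch of the two simplest meta-embedding constructions that ensemble approaches build on: concatenation and normalized averaging of source vectors. The function names and toy source vectors below are illustrative assumptions, not the paper's implementation; only NumPy is required.

```python
import numpy as np

def concat_meta_embedding(word, embedding_lookups):
    """Concatenate a word's vectors from several pre-trained sources.

    `embedding_lookups` is a list of dicts mapping word -> np.ndarray;
    the result's dimensionality is the sum of the sources' dimensions.
    """
    return np.concatenate([lookup[word] for lookup in embedding_lookups])

def averaged_meta_embedding(word, embedding_lookups):
    """Average L2-normalized source vectors (requires equal dimensions).

    Normalizing first keeps one source's scale from dominating the mean.
    """
    vecs = [lookup[word] / np.linalg.norm(lookup[word])
            for lookup in embedding_lookups]
    return np.mean(vecs, axis=0)

# Toy sources standing in for, e.g., GloVe and fastText lookups.
src_a = {"bank": np.array([0.2, 0.8, -0.1])}
src_b = {"bank": np.array([0.5, -0.3, 0.9])}
print(concat_meta_embedding("bank", [src_a, src_b]))    # 6-dim vector
print(averaged_meta_embedding("bank", [src_a, src_b]))  # 3-dim vector
```

Concatenation preserves all source information at the cost of higher dimensionality, while averaging keeps the dimension fixed but works best when the sources are aligned to a common space; the ensemble and dynamic variants compared in the paper are built on combinations of this kind.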

