Zero-training Sentence Embedding via Orthogonal Basis

09/30/2018
by Ziyi Yang, et al.

We propose a simple and robust training-free approach for building sentence representations. Inspired by the Gram-Schmidt process from linear algebra, we build an orthogonal basis of the subspace spanned by a word and its surrounding context in a sentence. We model the semantic meaning of a word in a sentence based on two aspects. One is its relatedness to the word vector subspace already spanned by its contextual words. The other is the word's novel semantic meaning, which is introduced as a new basis vector perpendicular to this existing subspace. Following this motivation, we develop a method based on orthogonal bases that combines pre-trained word embeddings into sentence representations. This approach requires zero training and zero parameters, and offers efficient inference. We evaluate our approach on 11 downstream NLP tasks. Experimental results show that our model outperforms all existing zero-training alternatives on all tasks and is competitive with approaches that rely on either large amounts of labelled data or prolonged training time.
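The core decomposition described above, splitting a word vector into a component inside the subspace spanned by its context words and a perpendicular "novel" component, can be sketched in a few lines of NumPy. This is an illustrative sketch of the orthogonal decomposition only, not the authors' full algorithm; the function name and the QR-based orthogonalization (equivalent in effect to Gram-Schmidt) are my own choices.

```python
import numpy as np

def decompose_word(word_vec, context_vecs):
    """Split a word vector into (projection, novel):
    - projection: the component lying in the subspace spanned by the context vectors
    - novel: the residual, perpendicular to that subspace (the new basis direction)
    """
    # Orthonormal basis of the context subspace; QR factorization performs
    # Gram-Schmidt-style orthogonalization of the stacked context vectors.
    Q, _ = np.linalg.qr(np.stack(context_vecs, axis=1))
    projection = Q @ (Q.T @ word_vec)  # component already covered by the context
    novel = word_vec - projection      # orthogonal component: the word's new meaning
    return projection, novel

# Toy example with random 5-dimensional "embeddings"
rng = np.random.default_rng(0)
w = rng.standard_normal(5)
ctx = [rng.standard_normal(5) for _ in range(3)]
proj, novel = decompose_word(w, ctx)

# Sanity checks: the two parts reconstruct the word vector,
# and the novel component is orthogonal to every context vector.
assert np.allclose(proj + novel, w)
assert all(abs(novel @ c) < 1e-9 for c in ctx)
```

The relative magnitudes of the two components then give a training-free signal for how much new semantic content a word contributes to its sentence.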
