Query Expansion with Locally-Trained Word Embeddings

05/25/2016
by Fernando Diaz, et al.

Continuous space word embeddings have received a great deal of attention in the natural language processing and machine learning communities for their ability to model term similarity and other relationships. We study the use of term relatedness in the context of query expansion for ad hoc information retrieval. We demonstrate that word embeddings such as word2vec and GloVe, when trained globally, underperform corpus- and query-specific embeddings for retrieval tasks. These results suggest that other tasks benefiting from global embeddings may also benefit from local embeddings.
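A minimal sketch of the expansion step the abstract describes: each query term is augmented with its nearest neighbors under cosine similarity in an embedding space. The tiny 2-d vectors and vocabulary below are purely illustrative stand-ins for a locally-trained embedding (in the paper's setting, such an embedding would be trained on documents topically related to the query rather than on the whole corpus).

```python
import math

# Hypothetical 2-d term vectors; in the local-embedding setting these
# would come from a model trained on query-specific documents.
EMBEDDINGS = {
    "cut":      [1.00, 0.00],
    "reduce":   [0.95, 0.05],
    "gasoline": [0.90, 0.10],
    "banana":   [0.00, 1.00],
}

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def expand_query(terms, embeddings, topn=2):
    """Return the original terms plus each term's topn nearest neighbors."""
    expansion = list(terms)
    for t in terms:
        if t not in embeddings:
            continue  # out-of-vocabulary terms are left unexpanded
        neighbors = sorted(
            (w for w in embeddings if w != t),
            key=lambda w: cosine(embeddings[t], embeddings[w]),
            reverse=True,
        )
        expansion.extend(neighbors[:topn])
    return expansion

print(expand_query(["cut"], EMBEDDINGS))  # ['cut', 'reduce', 'gasoline']
```

The expanded term list would then be fed to the retrieval model in place of the raw query; the paper's finding is that the quality of this expansion depends heavily on whether the embedding was trained globally or locally.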


research
05/26/2021

A data-driven strategy to combine word embeddings in information retrieval

Word embeddings are vital descriptors of words in unigram representation...
research
06/22/2016

Toward Word Embedding for Personalized Information Retrieval

This paper presents preliminary works on using Word Embedding (word2vec)...
research
11/08/2018

Deep Neural Networks for Query Expansion using Word Embeddings

Query expansion is a method for alleviating the vocabulary mismatch prob...
research
09/04/2019

Affect Enriched Word Embeddings for News Information Retrieval

Distributed representations of words have shown to be useful to improve ...
research
11/17/2019

Which Training Corpora for Query Expansion with Word Embeddings? An Application to Cultural Microblog Retrieval

We describe here an experimental framework and the results obtained on m...
research
01/12/2022

Diagnosing BERT with Retrieval Heuristics

Word embeddings, made widely popular in 2013 with the release of word2ve...
research
01/14/2020

Humpty Dumpty: Controlling Word Meanings via Corpus Poisoning

Word embeddings, i.e., low-dimensional vector representations such as Gl...
