
Query2Prod2Vec: Grounded Word Embeddings for eCommerce

by Federico Bianchi et al.

We present Query2Prod2Vec, a model that grounds lexical representations for product search in product embeddings: in our model, meaning is a mapping between words and a latent space of products in a digital shop. We leverage shopping sessions to learn the underlying space and use merchandising annotations to build lexical analogies for evaluation: our experiments show that our model is more accurate than known techniques from the NLP and IR literature. Finally, we stress the importance of data efficiency for product search outside of retail giants, and highlight how Query2Prod2Vec fits with practical constraints faced by most practitioners.
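The core idea can be sketched in a few lines: product embeddings are first learned from shopping sessions (e.g., with a word2vec-style model over session sequences), and a query is then represented as the average of the embeddings of the products users interacted with after issuing it. The product SKUs, vectors, and helper below are hypothetical illustrations, not the paper's actual data or API:

```python
from typing import Dict, List

# Hypothetical pre-trained product embeddings. In the paper these would be
# learned from shopping sessions (a prod2vec-style model); here they are
# hand-picked 2-d toy vectors for illustration.
product_vectors: Dict[str, List[float]] = {
    "sku_nike_shoe": [1.0, 0.0],
    "sku_adidas_shoe": [0.8, 0.2],
    "sku_tennis_racket": [0.0, 1.0],
}

def query_embedding(clicked_products: List[str],
                    vectors: Dict[str, List[float]]) -> List[float]:
    """Ground a query in product space by averaging the embeddings of
    products clicked after the query was issued (the Query2Prod2Vec idea)."""
    dims = len(next(iter(vectors.values())))
    total = [0.0] * dims
    hits = 0
    for sku in clicked_products:
        if sku in vectors:
            hits += 1
            for i, value in enumerate(vectors[sku]):
                total[i] += value
    if hits == 0:
        raise ValueError("no known products clicked for this query")
    return [t / hits for t in total]

# A query like "running shoes", grounded by two shoe clicks, lands
# between the two shoe vectors and far from the racket.
emb = query_embedding(["sku_nike_shoe", "sku_adidas_shoe"], product_vectors)
```

Because the query vector lives in the same space as the products, nearest-neighbor search over product embeddings can serve retrieval directly, which is part of why the approach is data-efficient for shops without web-scale query logs.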



Tailoring Word Embeddings for Bilexical Predictions: An Experimental Comparison

We investigate the problem of inducing word embeddings that are tailored...

Learning Latent Vector Spaces for Product Search

We introduce a novel latent vector space model that jointly learns the l...

BERT Goes Shopping: Comparing Distributional Models for Product Representations

Word embeddings (e.g., word2vec) have been applied successfully to eComm...

Enhanced word embeddings using multi-semantic representation through lexical chains

The relationship between words in a sentence often tells us more about t...

Integrating Form and Meaning: A Multi-Task Learning Model for Acoustic Word Embeddings

Models of acoustic word embeddings (AWEs) learn to map variable-length s...

On Extending NLP Techniques from the Categorical to the Latent Space: KL Divergence, Zipf's Law, and Similarity Search

Despite the recent successes of deep learning in natural language proces...

Scalable bundling via dense product embeddings

Bundling, the practice of jointly selling two or more products at a disc...