
BERT Goes Shopping: Comparing Distributional Models for Product Representations

12/17/2020
by Federico Bianchi, et al.

Word embeddings (e.g., word2vec) have been applied successfully to eCommerce products through prod2vec. Inspired by the recent performance improvements on several NLP tasks brought by contextualized embeddings, we propose to transfer BERT-like architectures to eCommerce: our model – ProdBERT – is trained to generate representations of products through masked session modeling. Through extensive experiments over multiple shops, different tasks, and a range of design choices, we systematically compare the accuracy of ProdBERT and prod2vec embeddings: while ProdBERT is found to be superior to traditional methods in several scenarios, we highlight the importance of resources and hyperparameters in the best performing models. Finally, we conclude by providing guidelines for training embeddings under a variety of computational and data constraints.
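The abstract contrasts two families of product representations: static prod2vec embeddings (word2vec trained over shopping sessions) and contextualized, BERT-style embeddings trained with masked session modeling. The sketch below is a minimal, hypothetical illustration of that contrast, not the paper's implementation: the toy sessions, the sku_* identifiers, the batchify helper, and every hyperparameter (32-dimensional vectors, a 2-layer encoder, 15% masking) are assumptions chosen only to make the example self-contained, using the gensim and HuggingFace transformers libraries.

```python
import random

import torch
from gensim.models import Word2Vec
from transformers import BertConfig, BertForMaskedLM

# Toy browsing sessions: each session is a sequence of product IDs (strings).
sessions = [
    ["sku_12", "sku_7", "sku_33", "sku_7"],
    ["sku_5", "sku_12", "sku_9"],
    ["sku_33", "sku_9", "sku_5", "sku_12"],
]

# prod2vec: skip-gram word2vec where "words" are products and "sentences" are sessions.
prod2vec = Word2Vec(sentences=sessions, vector_size=32, window=3, min_count=1, sg=1)
print(prod2vec.wv["sku_12"].shape)  # one static 32-dim vector per product

# ProdBERT-style masked session modeling: products become integer token IDs
# and a small BERT is trained to reconstruct masked products from their context.
products = sorted({p for s in sessions for p in s})
PAD, MASK = 0, 1
vocab = {p: i + 2 for i, p in enumerate(products)}  # reserve 0/1 for PAD/MASK

config = BertConfig(
    vocab_size=len(vocab) + 2,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
    max_position_embeddings=16,
    pad_token_id=PAD,
)
model = BertForMaskedLM(config)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)


def batchify(sessions, mask_prob=0.15, max_len=8):
    """Pad sessions and randomly mask products; unmasked positions get label -100."""
    input_ids, labels = [], []
    for s in sessions:
        ids = [vocab[p] for p in s][:max_len]
        lab = [-100] * len(ids)
        for i in range(len(ids)):
            if random.random() < mask_prob:
                lab[i] = ids[i]  # predict the original product...
                ids[i] = MASK    # ...from its masked position
        if mask_prob > 0 and all(l == -100 for l in lab):
            i = random.randrange(len(ids))  # guarantee at least one masked position
            lab[i], ids[i] = ids[i], MASK
        pad = max_len - len(ids)
        input_ids.append(ids + [PAD] * pad)
        labels.append(lab + [-100] * pad)
    return torch.tensor(input_ids), torch.tensor(labels)


model.train()
for _ in range(50):  # a handful of toy training steps
    inputs, labels = batchify(sessions)
    attention_mask = (inputs != PAD).long()
    loss = model(input_ids=inputs, attention_mask=attention_mask, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Contextualized product representations: the same product gets a different
# vector depending on the session it appears in.
model.eval()
with torch.no_grad():
    inputs, _ = batchify(sessions, mask_prob=0.0)
    hidden = model.bert(
        input_ids=inputs, attention_mask=(inputs != PAD).long()
    ).last_hidden_state
print(hidden.shape)  # (num_sessions, max_len, 32)
```

The difference the paper evaluates follows directly from the two objectives: prod2vec assigns each product a single static vector, whereas the masked session model produces a different vector for the same product depending on the surrounding session, at the cost of substantially more compute and data.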


Related research

09/09/2019 | Reverse Transfer Learning: Can Word Embeddings Trained for Different NLP Tasks Improve Neural Language Models?
Natural language processing (NLP) tasks tend to suffer from a paucity of...

04/02/2021 | Query2Prod2Vec: Grounded Word Embeddings for eCommerce
We present Query2Prod2Vec, a model that grounds lexical representations ...

01/25/2019 | Word Embeddings: A Survey
This work lists and describes the main recent strategies for building fi...

11/09/2020 | Catch the "Tails" of BERT
Recently, contextualized word embeddings outperform static word embeddin...

06/21/2022 | NorBERT: NetwOrk Representations through BERT for Network Analysis and Management
Deep neural network models have been very successfully applied to Natura...

09/15/2020 | Lessons Learned from Applying off-the-shelf BERT: There is no Silver Bullet
One of the challenges in the NLP field is training large classification ...