Word2Vec applied to Recommendation: Hyperparameters Matter

04/11/2018
by Hugo Caselles-Dupré, et al.

Skip-gram with negative sampling (SGNS), a popular variant of Word2vec originally designed and tuned to create word embeddings for Natural Language Processing, has been used to create item embeddings with successful applications in recommendation. While the two fields do not share the same type of data, nor evaluate on the same tasks, recommendation applications tend to reuse the already-tuned hyperparameter values, even though optimal hyperparameter values are known to be data- and task-dependent. We therefore investigate the marginal importance of each hyperparameter in a recommendation setting, through an extensive joint hyperparameter optimization on various datasets. Results reveal that optimizing neglected hyperparameters, namely the negative sampling distribution, the number of epochs, the subsampling parameter, and the window size, significantly improves performance on a recommendation task, by up to a factor of 10.
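To make the "negative sampling distribution" hyperparameter concrete: in SGNS, negative examples are drawn with probability proportional to item frequency raised to an exponent (0.75 in the original NLP setting). The sketch below, a minimal illustration with hypothetical item counts and a hypothetical helper name (not code from the paper), shows how changing that exponent reshapes the sampling distribution, from uniform (0.0) through the NLP default (0.75) to raw popularity (1.0):

```python
from collections import Counter

def negative_sampling_probs(item_counts, alpha):
    # P(i) is proportional to count(i) ** alpha. alpha = 0.75 is the
    # default inherited from NLP; the paper argues this value should be
    # tuned per dataset rather than reused blindly for recommendation.
    weights = {item: count ** alpha for item, count in item_counts.items()}
    total = sum(weights.values())
    return {item: w / total for item, w in weights.items()}

# Hypothetical item-popularity counts for illustration.
counts = Counter({"popular_item": 1000, "mid_item": 100, "rare_item": 10})

uniform = negative_sampling_probs(counts, 0.0)   # every item equally likely
default = negative_sampling_probs(counts, 0.75)  # NLP default, popularity-damped
raw = negative_sampling_probs(counts, 1.0)       # proportional to raw popularity
```

Lower exponents flatten the distribution, so rare items are sampled as negatives more often; which end of this range works best is exactly the kind of data-dependent choice the paper's joint optimization explores.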


Related research

02/26/2018: Stochastic Hyperparameter Optimization through Hypernetworks
Machine learning models are often tuned by nesting optimization of model...

09/24/2020: Tuning Word2vec for Large Scale Recommendation Systems
Word2vec is a powerful machine learning tool that emerged from Natural L...

04/16/2021: Word2rate: training and evaluating multiple word embeddings as statistical transitions
Using pretrained word embeddings has been shown to be a very effective w...

05/21/2020: HyperSTAR: Task-Aware Hyperparameters for Deep Networks
While deep neural networks excel in solving visual recognition tasks, th...

07/21/2017: Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks
Selecting optimal parameters for a neural network architecture can often...

09/27/2016: Optimizing Neural Network Hyperparameters with Gaussian Processes for Dialog Act Classification
Systems based on artificial neural networks (ANNs) have achieved state-o...

08/10/2023: Revisiting N-CNN for Clinical Practice
This paper revisits the Neonatal Convolutional Neural Network (N-CNN) by...
