Adversarial Training of Word2Vec for Basket Completion

05/22/2018
by   Ugo Tanielian, et al.

In recent years, the Word2Vec model trained with the Negative Sampling loss function has shown state-of-the-art results in a number of machine learning tasks, including language modeling tasks such as word analogy and word similarity, and recommendation tasks through Prod2Vec, an extension that models user shopping activity and user preferences. Several methods that aim to improve upon the standard Negative Sampling loss have been proposed. In our paper we pursue more sophisticated Negative Sampling by leveraging ideas from the field of Generative Adversarial Networks (GANs), and propose Adversarial Negative Sampling. We build upon recent progress in stabilizing the training objective of GANs in the discrete data setting, and introduce a new GAN-Word2Vec model. We evaluate our model on the task of basket completion and show significant improvements in performance over Word2Vec trained with standard loss functions, including Noise Contrastive Estimation and Negative Sampling.
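To make the baseline concrete, the following is a minimal NumPy sketch of the standard Word2Vec Negative Sampling loss for a single (center, context) pair, with negatives drawn from a fixed noise distribution. All names (`W_in`, `W_out`, vocabulary size, the uniform negative sampler) are illustrative assumptions, not the paper's implementation; the paper's contribution is to replace the fixed noise sampler with a learned adversarial generator.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, DIM = 1000, 32
# Hypothetical input/output embedding tables, randomly initialized.
W_in = rng.normal(scale=0.1, size=(VOCAB, DIM))
W_out = rng.normal(scale=0.1, size=(VOCAB, DIM))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def negative_sampling_loss(center, context, negatives):
    """Standard Negative Sampling loss for one (center, context) pair.

    Maximizes the score of the observed context word and minimizes the
    scores of k sampled noise words. In standard Word2Vec the negatives
    come from a fixed (e.g. smoothed unigram) distribution; Adversarial
    Negative Sampling would instead draw them from a trained generator.
    """
    v_c = W_in[center]
    pos = np.log(sigmoid(W_out[context] @ v_c))          # observed pair
    neg = np.sum(np.log(sigmoid(-W_out[negatives] @ v_c)))  # noise pairs
    return -(pos + neg)

# Fixed-noise negatives (uniform here for simplicity): the setup the
# paper's adversarial sampler improves upon.
negatives = rng.integers(0, VOCAB, size=5)
loss = negative_sampling_loss(center=3, context=7, negatives=negatives)
```

In a basket-completion setting (Prod2Vec), the "center" and "context" indices would be products co-occurring in a basket rather than words in a window; the loss itself is unchanged.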


