Relaxed Softmax for learning from Positive and Unlabeled data

09/17/2019
by Ugo Tanielian, et al.

In recent years, the softmax model and its fast approximations have become the de facto loss functions for deep neural networks in multi-class prediction. The softmax loss has also been extended to language modeling and recommendation, two fields that fall into the framework of learning from Positive and Unlabeled (PU) data. In this paper, we highlight the drawbacks of the current family of softmax losses and sampling schemes when applied in a PU learning setup. We propose both a Relaxed Softmax (RS) loss and a new negative sampling scheme based on a Boltzmann formulation. We show that the new training objective is better suited to the tasks of density estimation, item similarity, and next-event prediction, yielding performance uplifts over the classical softmax on textual and recommendation datasets.
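For intuition, the sketch below illustrates the two ingredients the abstract names: negative sampling from a Boltzmann (temperature-scaled) distribution over item scores, and a softmax-style loss computed over the positive item and the sampled negatives. This is a minimal NumPy illustration under assumptions, not the paper's exact Relaxed Softmax formulation; the function names, the temperature parameter, and the sampled-softmax form are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def boltzmann_negative_sampling(scores, num_negatives, temperature=1.0):
    """Sample negative indices with probability proportional to
    exp(score / temperature), i.e. a Boltzmann distribution over items,
    so higher-scoring ("harder") negatives are drawn more often."""
    logits = scores / temperature
    logits = logits - logits.max()          # shift for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return rng.choice(len(scores), size=num_negatives, replace=False, p=probs)

def sampled_softmax_loss(pos_score, neg_scores):
    """Cross-entropy of the positive against the positive plus the sampled
    negatives: one common sampled approximation of the full softmax."""
    logits = np.concatenate(([pos_score], neg_scores))
    logits = logits - logits.max()
    log_partition = np.log(np.exp(logits).sum())
    return -(logits[0] - log_partition)     # -log p(positive | sampled set)

# Toy usage: model scores for a 10-item catalogue; item 3 is the observed positive.
scores = rng.normal(size=10)
candidate_scores = np.delete(scores, 3)     # all items except the positive
neg_idx = boltzmann_negative_sampling(candidate_scores, num_negatives=4,
                                      temperature=0.5)
loss = sampled_softmax_loss(scores[3], candidate_scores[neg_idx])
print(f"sampled softmax loss: {loss:.4f}")
```

Lowering the temperature concentrates the sampler on high-scoring negatives, while raising it approaches uniform sampling; the Boltzmann formulation makes this trade-off an explicit tuning knob.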

research
11/16/2015

An Exploration of Softmax Alternatives Belonging to the Spherical Loss Family

In a multi-class classification problem, it is standard to model the out...

research
03/04/2020

On the Learning Property of Logistic and Softmax Losses for Deep Neural Networks

Deep convolutional neural networks (CNNs) trained with logistic and soft...

research
07/26/2017

Self-organized Hierarchical Softmax

We propose a new self-organizing hierarchical softmax formulation for ne...

research
08/04/2019

Softmax Dissection: Towards Understanding Intra- and Inter-class Objective for Embedding Learning

The softmax loss and its variants are widely used as objectives for embe...

research
12/01/2015

Loss Functions for Top-k Error: Analysis and Insights

In order to push the performance on realistic computer vision tasks, the...

research
05/22/2018

Adversarial Training of Word2Vec for Basket Completion

In recent years, the Word2Vec model trained with the Negative Sampling l...
