Efficient Sampled Softmax for Tensorflow

04/10/2020
by Maciej Skorski, et al.

This short paper discusses an efficient implementation of the sampled softmax loss for TensorFlow. The speedup over the default implementation comes from simplifying the computation graph for the forward and backward passes.
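
For context, the default implementation referred to here is presumably TensorFlow's built-in tf.nn.sampled_softmax_loss. The sketch below, with hypothetical layer sizes chosen only for illustration, shows how that stock API is typically wired up; it illustrates the general technique being accelerated, not the paper's optimized variant.

import tensorflow as tf

# Hypothetical sizes, for illustration only.
num_classes = 50000   # size of the output vocabulary
embed_dim = 128       # hidden/embedding dimension
batch_size = 32
num_sampled = 64      # negative classes sampled per batch

# Output-layer parameters; note the [num_classes, dim] weight layout
# expected by tf.nn.sampled_softmax_loss.
softmax_w = tf.Variable(tf.random.normal([num_classes, embed_dim]))
softmax_b = tf.Variable(tf.zeros([num_classes]))

# A fake batch: hidden activations and integer class labels.
hidden = tf.random.normal([batch_size, embed_dim])
labels = tf.random.uniform([batch_size, 1], maxval=num_classes,
                           dtype=tf.int64)

# Sampled softmax estimates the full softmax cross-entropy by scoring
# only the true class plus num_sampled sampled negatives, so the cost
# per step scales with num_sampled rather than num_classes.
loss = tf.reduce_mean(
    tf.nn.sampled_softmax_loss(
        weights=softmax_w,
        biases=softmax_b,
        labels=labels,
        inputs=hidden,
        num_sampled=num_sampled,
        num_classes=num_classes,
    )
)

Note that sampled softmax is a training-time approximation; evaluation typically falls back to the full softmax over all classes.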

Related Research

07/24/2019 · Sampled Softmax with Random Fourier Features
The computational cost of training with softmax cross entropy loss grows...

01/07/2022 · On the Effectiveness of Sampled Softmax Loss for Item Recommendation
Learning objectives of recommender models remain largely unexplored. Mos...

03/31/2022 · Memory-Efficient Training of RNN-Transducer with Sampled Softmax
RNN-Transducer has been one of the promising architectures for end-to-end au...

04/28/2019 · Softmax Optimizations for Intel Xeon Processor-based Platforms
Softmax is a popular normalization method used in machine learning. Deep l...

09/08/2017 · TensorFlow Agents: Efficient Batched Reinforcement Learning in TensorFlow
We introduce TensorFlow Agents, an efficient infrastructure paradigm for...

08/12/2017 · Noisy Softmax: Improving the Generalization Ability of DCNN via Postponing the Early Softmax Saturation
Over the past few years, softmax and SGD have become a commonly used com...

12/31/2020 · A Constant-time Adaptive Negative Sampling
Softmax classifiers with a very large number of classes naturally occur ...
