One-vs-Each Approximation to Softmax for Scalable Estimation of Probabilities

09/23/2016
by Michalis K. Titsias

The softmax representation of probabilities for categorical variables plays a prominent role in modern machine learning, with numerous applications in areas such as large-scale classification, neural language modeling, and recommendation systems. However, softmax estimation is very expensive for large-scale inference because of the high cost of computing the normalizing constant. Here, we introduce an efficient approximation to softmax probabilities that takes the form of a rigorous lower bound on the exact probability. The bound is expressed as a product over pairwise probabilities, and it leads to scalable estimation based on stochastic optimization. It allows us to perform doubly stochastic estimation by subsampling both training instances and class labels. We show that the new bound has interesting theoretical properties and demonstrate its use in classification problems.
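To make the abstract concrete: writing f for the vector of class logits and σ for the logistic sigmoid, the paper's one-vs-each bound states that softmax_k(f) = e^{f_k} / Σ_j e^{f_j} ≥ Π_{j≠k} σ(f_k − f_j). Because the log of the right-hand side is a sum of pairwise terms, it can be estimated without bias by subsampling the negative classes j, which is what enables the doubly stochastic optimization mentioned above. The NumPy sketch below is a minimal illustration, not code from the paper; the function names, the toy logits, and the choice of 100 classes and 10 sampled negatives are my own.

```python
import numpy as np

def softmax_prob(f, k):
    """Exact softmax probability of class k given the logit vector f."""
    e = np.exp(f - f.max())                   # subtract max for numerical stability
    return e[k] / e.sum()

def log_one_vs_each_bound(f, k):
    """Log of the one-vs-each lower bound: the sum over j != k of
    log sigma(f_k - f_j), using log sigma(x) = -log1p(exp(-x))."""
    diffs = f[k] - np.delete(f, k)
    return np.sum(-np.log1p(np.exp(-diffs)))

def subsampled_log_bound(f, k, num_neg, rng):
    """Unbiased estimate of the log-bound from a random subset of negative
    classes; this is the class-subsampling half of the doubly stochastic
    scheme (instance subsampling is ordinary minibatching)."""
    negatives = np.delete(np.arange(len(f)), k)
    sampled = rng.choice(negatives, size=num_neg, replace=False)
    scale = len(negatives) / num_neg           # rescale for unbiasedness
    return scale * np.sum(-np.log1p(np.exp(-(f[k] - f[sampled]))))

rng = np.random.default_rng(0)
f = rng.normal(size=100)                       # toy logits for 100 classes
k = 3

log_exact = np.log(softmax_prob(f, k))
log_bound = log_one_vs_each_bound(f, k)
assert log_bound <= log_exact                  # the bound never exceeds the exact probability
print(f"log p_k = {log_exact:.3f}, log one-vs-each bound = {log_bound:.3f}")

# Averaging the subsampled estimator recovers the full log-bound in expectation.
estimates = [subsampled_log_bound(f, k, num_neg=10, rng=rng) for _ in range(5000)]
print(f"full log-bound = {log_bound:.3f}, mean subsampled estimate = {np.mean(estimates):.3f}")
```

Note that the subsampled estimator is unbiased for the log of the bound (a sum), not for the bound itself (a product); since stochastic optimization of a lower bound works on the log scale, that is the quantity a minibatch gradient step would use.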



Related Research

Augment and Reduce: Stochastic Inference for Large Categorical Distributions (02/12/2018)
Categorical distributions are ubiquitous in machine learning, e.g., in c...

Effectiveness of Hierarchical Softmax in Large Scale Classification Tasks (12/13/2018)
Typically, Softmax is used in the final layer of a neural network to get...

DropMax: Adaptive Stochastic Softmax (12/21/2017)
We propose DropMax, a stochastic version of softmax classifier which at ...

Doubly Sparse: Sparse Mixture of Sparse Experts for Efficient Softmax Inference (01/30/2019)
Computations for the softmax function are significantly expensive when t...

Navigating with Graph Representations for Fast and Scalable Decoding of Neural Language Models (06/11/2018)
Neural language models (NLMs) have recently gained a renewed interest by...

Data-Free/Data-Sparse Softmax Parameter Estimation with Structured Class Geometries (06/03/2018)
This note considers softmax parameter estimation when little/no labeled ...

Extreme Classification via Adversarial Softmax Approximation (02/15/2020)
Training a classifier over a large number of classes, known as 'extreme ...
