
Contextual Temperature for Language Modeling

by Pei-Hsin Wang, et al.

Temperature scaling has been widely used as an effective approach to control the smoothness of a distribution, which improves model performance across various tasks. Current practice applies temperature scaling with either a fixed temperature or a manually crafted, dynamically changing schedule. However, our studies indicate that the optimal temperature trajectory for each class can change with the context. To this end, we propose contextual temperature, a generalized approach that learns an optimal temperature trajectory for each vocabulary item over the context. Experimental results confirm that the proposed method significantly improves state-of-the-art language models, achieving test-set perplexities of 55.31 on Penn Treebank and 62.89 on WikiText-2. In-depth analyses show that the behaviour of the learned temperature schedules varies dramatically across vocabulary items, and that the optimal schedules help in controlling the uncertainties. This evidence further justifies the need for the proposed method and its advantages over fixed temperature schedules.
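To make the core idea concrete, the following is a minimal NumPy sketch of contextual temperature as described above: a per-vocabulary-item temperature is predicted from the context vector and applied element-wise to the logits before the softmax. The linear map `W_tau` and the sigmoid bounding of the temperature into a fixed range are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, HIDDEN = 8, 4

# Hypothetical parameters; in the paper these would be learned
# jointly with the language model.
W_tau = rng.normal(scale=0.1, size=(HIDDEN, VOCAB))

def contextual_temperature(h, tau_min=0.5, tau_max=2.0):
    """Map a context vector h to one temperature per vocabulary item.

    Bounding via a sigmoid squashed into [tau_min, tau_max] is an
    assumption made for this sketch.
    """
    s = 1.0 / (1.0 + np.exp(-(h @ W_tau)))
    return tau_min + (tau_max - tau_min) * s

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

h = rng.normal(size=HIDDEN)       # context, e.g. an LSTM hidden state
logits = rng.normal(size=VOCAB)   # unnormalized next-token scores

tau = contextual_temperature(h)   # one temperature per vocabulary item
probs = softmax(logits / tau)     # element-wise scaling, then softmax
```

A fixed-temperature scheme corresponds to `tau` being a constant vector; here each vocabulary item's temperature varies with `h`, so the sharpness of the predicted distribution can differ per token and per context.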



