Tail Batch Sampling: Approximating Global Contrastive Losses as Optimization over Batch Assignments

10/23/2022
by Vin Sachidananda, et al.

Contrastive learning has recently achieved state-of-the-art performance across a wide range of tasks. Many contrastive learning approaches use mined hard negatives to make batches more informative during training, but these approaches are inefficient: they increase epoch length in proportion to the number of mined negatives and require frequent updates of nearest-neighbor indices or mining from recent batches. In this work, we provide an alternative to hard negative mining in supervised contrastive learning: Tail Batch Sampling (TBS), an efficient approximation to the batch assignment problem that upper bounds the gap between the global and training losses, ℒ^Global - ℒ^Train. TBS improves state-of-the-art performance on sentence embedding (+0.37 Spearman) and code-search tasks (+2.2% MRR), is easy to implement (requiring only a few additional lines of code), maintains no external data structures such as nearest-neighbor indices, is more computationally efficient than even the most minimal hard negative mining approaches, and makes no changes to the model being trained.
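
To make the batch-assignment idea concrete, the sketch below shows one plausible way to group examples whose embeddings are mutually similar into the same batch, so that in-batch negatives are harder than randomly sampled ones. This is an illustrative, assumption-based Python sketch, not the paper's actual TBS algorithm; the function name greedy_similarity_batches and the greedy assignment strategy are hypothetical.

# Hypothetical sketch: greedy batch assignment by embedding similarity.
# Not the paper's TBS algorithm; names and logic are illustrative only.
import numpy as np

def greedy_similarity_batches(embeddings: np.ndarray, batch_size: int):
    """Greedily group examples whose (pre-computed) embeddings are most
    similar into the same batch, so in-batch negatives are harder than
    random ones."""
    n = embeddings.shape[0]
    # Cosine similarity between all pairs of examples.
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = normed @ normed.T
    np.fill_diagonal(sim, -np.inf)  # an example is never its own negative

    unassigned = set(range(n))
    batches = []
    while unassigned:
        # Seed a new batch with an arbitrary remaining example.
        seed = unassigned.pop()
        batch = [seed]
        while len(batch) < batch_size and unassigned:
            # Add the unassigned example most similar to the current batch.
            candidates = list(unassigned)
            scores = sim[np.ix_(batch, candidates)].max(axis=0)
            best = candidates[int(np.argmax(scores))]
            unassigned.remove(best)
            batch.append(best)
        batches.append(batch)
    return batches

# Example usage with random embeddings for 1,000 examples.
rng = np.random.default_rng(0)
emb = rng.normal(size=(1000, 128)).astype(np.float32)
batches = greedy_similarity_batches(emb, batch_size=64)

Note that this sketch assigns batches greedily from pairwise similarities rather than solving the batch assignment problem that TBS approximates, and it recomputes the full similarity matrix up front, which would not scale to large corpora without further approximation.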
