Negative Samples are at Large: Leveraging Hard-distance Elastic Loss for Re-identification

07/20/2022
by Hyungtae Lee, et al.

We present a Momentum Re-identification (MoReID) framework that can leverage a very large number of negative samples in training for the general re-identification task. The design of this framework is inspired by Momentum Contrast (MoCo), which uses a dictionary to store current and past batches to build a large set of encoded samples. Because we find it less effective to use past positive samples, which may be highly inconsistent with the encoded feature properties formed by the current positive samples, MoReID is designed to store only a large number of negative samples in the dictionary. However, the widely used Triplet loss, which represents a set of positive/negative samples with a single sample each, cannot effectively exploit the enlarged negative sample set acquired by the MoReID framework. To maximize the advantage of the scaled-up negative sample set, we introduce the Hard-distance Elastic loss (HE loss), which can use more than one hard sample to represent a large number of samples. Our experiments demonstrate that the large number of negative samples provided by the MoReID framework can be utilized at full capacity only with the HE loss, achieving state-of-the-art accuracy on three re-ID benchmarks: VeRi-776, Market-1501, and VeRi-Wild.
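The abstract describes two ingredients without giving their exact formulation, so the PyTorch sketch below is only an illustration under stated assumptions: a MoCo-style FIFO dictionary (here called NegativeQueue) that supplies a large pool of encoded negatives from past batches, and a margin-based loss that keeps several hard positives and hard negatives per anchor rather than the single hardest pair used by the classic Triplet loss. The names and parameters (NegativeQueue, hard_distance_elastic_loss, queue_size, margin, top_k) are assumptions for exposition, not the authors' implementation.

```python
# Illustrative sketch only: the exact HE-loss formulation is not given in the abstract,
# so the margin-based form and all names/parameters below are assumptions.
import torch
import torch.nn.functional as F


class NegativeQueue:
    """MoCo-style FIFO dictionary that stores encoded negatives from past batches."""

    def __init__(self, dim: int, queue_size: int = 65536):
        # Start from random unit vectors; training would quickly overwrite them.
        self.queue = F.normalize(torch.randn(queue_size, dim), dim=1)
        self.ptr = 0

    @torch.no_grad()
    def enqueue(self, keys: torch.Tensor):
        # Overwrite the oldest entries with the newest encoded batch (circular buffer).
        n = keys.size(0)
        idx = torch.arange(self.ptr, self.ptr + n) % self.queue.size(0)
        self.queue[idx] = F.normalize(keys, dim=1)
        self.ptr = int((self.ptr + n) % self.queue.size(0))


def hard_distance_elastic_loss(anchor, positives, negatives, margin=0.3, top_k=16):
    """Margin loss that uses several hard samples per anchor (assumed form),
    unlike the classic Triplet loss, which keeps only the single hardest pair."""
    d_pos = torch.cdist(anchor.unsqueeze(0), positives).squeeze(0)  # (P,) anchor-positive distances
    d_neg = torch.cdist(anchor.unsqueeze(0), negatives).squeeze(0)  # (N,) N may be very large
    hard_pos = d_pos.topk(min(top_k, d_pos.numel())).values                 # farthest positives
    hard_neg = d_neg.topk(min(top_k, d_neg.numel()), largest=False).values  # closest negatives
    # Penalize every hard positive that is not at least `margin` closer than every hard negative.
    return F.relu(hard_pos.unsqueeze(1) - hard_neg.unsqueeze(0) + margin).mean()


# Usage sketch: negatives come from the queue, so their count is decoupled from the batch size.
queue = NegativeQueue(dim=128)
anchor, positives = torch.randn(128), torch.randn(4, 128)
loss = hard_distance_elastic_loss(anchor, positives, queue.queue)
queue.enqueue(torch.randn(32, 128))  # store the current batch's encoded features for later steps
```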


