BatchFormer: Learning to Explore Sample Relationships for Robust Representation Learning

03/03/2022
by   Zhi Hou, et al.

Despite the success of deep neural networks, there are still many challenges in deep representation learning due to data scarcity issues such as data imbalance, unseen distributions, and domain shift. To address these issues, a variety of methods have been devised to explore sample relationships in a vanilla way (i.e., from the perspective of either the input or the loss function), but they fail to exploit the internal structure of deep neural networks for learning with sample relationships. Motivated by this, we propose to equip deep neural networks themselves with the ability to learn sample relationships from each mini-batch. Specifically, we introduce a batch transformer module, or BatchFormer, which is applied to the batch dimension of each mini-batch to implicitly explore sample relationships during training. By doing this, the proposed method enables collaboration among different samples; for example, head-class samples can also contribute to the learning of tail classes in long-tailed recognition. Furthermore, to mitigate the gap between training and testing, we share the classifier between the streams with and without the BatchFormer during training, so that the module can be removed during testing. We perform extensive experiments on over ten datasets, and the proposed method achieves significant improvements, without any bells and whistles, on a range of data-scarcity applications, including long-tailed recognition, compositional zero-shot learning, domain generalization, and contrastive learning. Code will be made publicly available at <https://github.com/zhihou7/BatchFormer>
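As a rough illustration of the idea described in the abstract, the PyTorch sketch below applies a standard Transformer encoder layer along the batch dimension and shares a single classifier between the features with and without the module. The class name `BatchFormerHead`, the hyperparameters, and the two-stream forward pass are assumptions made for illustration based on the abstract, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class BatchFormerHead(nn.Module):
    """Sketch of a BatchFormer-style head: a Transformer encoder layer that
    attends across the mini-batch, plus a classifier shared between the
    streams with and without the module (all sizes are illustrative)."""

    def __init__(self, feat_dim=512, num_heads=4, num_classes=1000):
        super().__init__()
        # Self-attention applied over the batch dimension.
        self.batch_encoder = nn.TransformerEncoderLayer(
            d_model=feat_dim, nhead=num_heads, dim_feedforward=feat_dim)
        # One classifier used both with and without the BatchFormer features,
        # which is what allows the module to be dropped at test time.
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, features, labels=None):
        # features: (B, feat_dim) produced by any backbone.
        if not self.training:
            # Inference: no cross-sample attention, only the shared classifier,
            # so the test-time pipeline is unchanged.
            return self.classifier(features)
        # Treat the mini-batch as a length-B sequence so self-attention can
        # model relationships between samples: (B, feat_dim) -> (B, 1, feat_dim).
        mixed = self.batch_encoder(features.unsqueeze(1)).squeeze(1)
        # Both streams pass through the same classifier; labels are duplicated
        # so a standard loss supervises them jointly.
        logits = self.classifier(torch.cat([features, mixed], dim=0))
        labels = torch.cat([labels, labels], dim=0)
        return logits, labels
```

During training, the returned logits and duplicated labels can be fed to an ordinary cross-entropy loss; at test time the head reduces to the shared classifier, so removing the BatchFormer adds no inference cost.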


Related research

04/04/2022 · BatchFormerV2: Exploring Sample Relationships for Dense Representation Learning
Attention mechanisms have been very popular in deep neural networks, whe...

08/09/2021 · Unified Regularity Measures for Sample-wise Learning and Generalization
Fundamental machine learning theory shows that different samples contrib...

05/25/2022 · Contrastive Learning with Boosted Memorization
Self-supervised learning has achieved a great success in the representat...

03/22/2022 · Rebalanced Siamese Contrastive Mining for Long-Tailed Recognition
Deep neural networks perform poorly on heavily class-imbalanced datasets...

11/19/2022 · Rethinking Batch Sample Relationships for Data Representation: A Batch-Graph Transformer based Approach
Exploring sample relationships within each mini-batch has shown great po...

11/10/2021 · Feature Generation for Long-tail Classification
The visual world naturally exhibits an imbalance in the number of object...

08/02/2023 · Can We Transfer Noise Patterns? An Multi-environment Spectrum Analysis Model Using Generated Cases
Spectrum analysis systems in online water quality testing are designed t...
