SimDETR: Simplifying self-supervised pretraining for DETR

07/28/2023
by Ioannis Maniadis Metaxas, et al.

DETR-based object detectors have achieved remarkable performance but are sample-inefficient and exhibit slow convergence. Unsupervised pretraining has been shown to alleviate these impediments, allowing large amounts of unlabeled data to be used to improve the detector's performance. However, existing methods have limitations, such as keeping the detector's backbone frozen to avoid performance degradation and using pretraining objectives that are misaligned with the downstream task. To overcome these limitations, we propose a simple pretraining framework for DETR-based detectors that consists of three simple yet key ingredients: (i) richer, semantics-based initial proposals derived from high-level feature maps, (ii) discriminative training using object pseudo-labels produced via clustering, and (iii) self-training to take advantage of the improved object proposals learned by the detector. We report two main findings: (1) our pretraining outperforms prior DETR pretraining works in both the full- and low-data regimes by significant margins; (2) we can pretrain DETR from scratch (including the backbone) directly on complex image datasets like COCO, paving the way for unsupervised representation learning directly with DETR.
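
To make ingredients (i) and (ii) concrete, below is a minimal, illustrative sketch of how object proposals could be derived from a high-level backbone feature map and assigned pseudo-class labels by clustering their embeddings. The thresholding heuristic, the use of k-means, and all function names and hyperparameters (proposals_from_feature_map, pseudo_labels_by_clustering, activation_quantile, num_clusters) are assumptions made for illustration, not the paper's exact procedure.

# Illustrative sketch only: proposals from a high-level feature map plus
# pseudo-labels via clustering. All heuristics and names are assumptions.
import torch
from sklearn.cluster import KMeans

def proposals_from_feature_map(feat, activation_quantile=0.9):
    """Turn a (C, H, W) high-level feature map into coarse box proposals.

    Cells whose channel-mean activation exceeds a quantile threshold are
    treated as object-like; each salient cell simply becomes a unit box here
    to keep the sketch short (a real pipeline would group cells into regions).
    """
    saliency = feat.mean(dim=0)                       # (H, W) per-cell activation
    thresh = torch.quantile(saliency.flatten(), activation_quantile)
    ys, xs = torch.nonzero(saliency > thresh, as_tuple=True)
    # Boxes in feature-map coordinates: (x1, y1, x2, y2), one per salient cell.
    return torch.stack([xs, ys, xs + 1, ys + 1], dim=1).float()

def pseudo_labels_by_clustering(feats, boxes_per_image, num_clusters=50):
    """Assign a pseudo-class id to every proposal by k-means over its embedding.

    feats: list of (C, H, W) feature maps, one per image.
    boxes_per_image: list of (N_i, 4) proposal boxes in feature-map coordinates.
    Returns a list of (N_i,) integer pseudo-labels aligned with the proposals.
    """
    embeddings, counts = [], []
    for feat, boxes in zip(feats, boxes_per_image):
        for x1, y1, x2, y2 in boxes.long():
            region = feat[:, y1:y2, x1:x2]               # crop the proposal region
            embeddings.append(region.mean(dim=(1, 2)))   # average-pooled embedding
        counts.append(len(boxes))
    emb = torch.stack(embeddings).detach().numpy()
    labels = torch.from_numpy(KMeans(n_clusters=num_clusters, n_init=10).fit_predict(emb))
    # Split the flat label vector back into per-image chunks.
    return list(torch.split(labels, counts))

The resulting (box, pseudo-label) pairs could then serve as discriminative targets for standard DETR set-prediction training, with self-training (ingredient iii) periodically replacing the initial proposals with the detector's own improved predictions.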
