ImageNet-21K Pretraining for the Masses

04/22/2021
by Tal Ridnik, et al.

ImageNet-1K serves as the primary dataset for pretraining deep learning models for computer vision tasks. The ImageNet-21K dataset, which contains more images and classes, is used less frequently for pretraining, mainly due to its complexity and an underestimation of its added value compared to standard ImageNet-1K pretraining. This paper aims to close this gap and make high-quality, efficient pretraining on ImageNet-21K available to everyone. Via a dedicated preprocessing stage, use of WordNet hierarchies, and a novel training scheme called semantic softmax, we show that various models, including small mobile-oriented models, significantly benefit from ImageNet-21K pretraining across numerous datasets and tasks. We also show that our scheme outperforms previous ImageNet-21K pretraining schemes for prominent new models such as ViT. The proposed pretraining pipeline is efficient, accessible, and leads to SoTA, reproducible results from a publicly available dataset. The training code and pretrained models are available at: https://github.com/Alibaba-MIIL/ImageNet21K
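
The abstract does not spell out how semantic softmax works, but the idea of combining per-level softmax losses over a WordNet-derived class hierarchy can be sketched as follows. This is a minimal, hypothetical PyTorch illustration, assuming the 21K classes have been pre-partitioned into disjoint semantic groups (hierarchy levels) with per-group target indices; the names `semantic_softmax_loss` and `group_slices` are illustrative and not the repository's actual API.

```python
import torch
import torch.nn.functional as F

def semantic_softmax_loss(logits, targets, group_slices):
    """Hedged sketch of a hierarchical ("semantic") softmax loss.

    Assumptions (illustrative, not the paper's exact implementation):
    - the output classes are partitioned into disjoint semantic groups
      derived from the WordNet hierarchy;
    - group_slices[g] = (start, end) gives the column range of group g
      inside `logits`;
    - targets[:, g] holds the in-group class index for each sample, or
      -1 when the sample has no label at that hierarchy level.
    """
    losses = []
    for g, (start, end) in enumerate(group_slices):
        group_logits = logits[:, start:end]        # logits restricted to this semantic group
        group_targets = targets[:, g]
        valid = group_targets >= 0                  # samples labeled at this hierarchy level
        if valid.any():
            losses.append(F.cross_entropy(group_logits[valid], group_targets[valid]))
    # average the per-group cross-entropy terms into a single training loss
    return torch.stack(losses).mean()


# Toy usage: batch of 4 samples, 10 classes split into two semantic groups.
logits = torch.randn(4, 10)
targets = torch.tensor([[2, -1], [0, 3], [5, 1], [-1, 0]])
group_slices = [(0, 6), (6, 10)]
loss = semantic_softmax_loss(logits, targets, group_slices)
```

The key design point this sketch tries to capture is that each sample only contributes a loss term at the hierarchy levels where it actually carries a label, so partially-labeled classes in the hierarchy do not penalize the model.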

