Reproducible scaling laws for contrastive language-image learning

12/14/2022
by Mehdi Cherti et al.

Scaling up neural networks has led to remarkable performance across a wide range of tasks. Moreover, performance often follows reliable scaling laws as a function of training set size, model size, and compute, which offers valuable guidance as large-scale experiments become increasingly expensive. However, previous work on scaling laws has primarily used private data and models, or has focused on uni-modal language or vision learning. To address these limitations, we investigate scaling laws for contrastive language-image pre-training (CLIP) with the public LAION dataset and the open-source OpenCLIP repository. Our large-scale experiments involve models trained on up to two billion image-text pairs and identify power-law scaling for multiple downstream tasks, including zero-shot classification, retrieval, linear probing, and end-to-end fine-tuning. We find that the training distribution plays a key role in scaling laws, as the OpenAI and OpenCLIP models exhibit different scaling behavior despite identical model architectures and similar training recipes. We open-source our evaluation workflow and all models, including the largest public CLIP models, to ensure reproducibility and make scaling laws research more accessible. Source code and instructions to reproduce this study will be available at https://github.com/LAION-AI/scaling-laws-openclip
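
The power-law relationships described above are typically expressed as y = a * x^b, where x is a scale quantity (samples seen, model size, or total compute) and y is a downstream metric such as zero-shot error. As a rough illustration of how such a law can be fit, the Python sketch below performs a linear fit in log-log space on synthetic data; the functional form, the data points, and the fitting procedure are illustrative assumptions, not the fits reported in the paper.

```python
# Minimal sketch: fitting a power law y = a * x^b to a downstream metric
# versus training scale. The (compute, error) pairs below are synthetic
# placeholders, not results from the paper.
import numpy as np

# Hypothetical total-compute values and zero-shot error rates.
compute = np.array([1e9, 1e10, 1e11, 1e12, 1e13])
error = np.array([0.60, 0.48, 0.39, 0.31, 0.25])

# A power law is linear in log-log space: log y = b * log x + log a.
b, log_a = np.polyfit(np.log(compute), np.log(error), deg=1)
a = np.exp(log_a)
print(f"fitted power law: error ~= {a:.3f} * compute^({b:.3f})")

# Extrapolate the fitted law to a larger compute budget (illustrative only).
print(f"predicted error at 1e14: {a * 1e14 ** b:.3f}")
```

The released checkpoints can be evaluated with the open_clip Python package, which backs the OpenCLIP repository mentioned above. The sketch below shows a minimal zero-shot classification pass; the architecture name, pretrained tag, image path, and class prompts are illustrative assumptions, and the checkpoints actually released with this study are listed in the linked repository.

```python
# Minimal sketch: zero-shot classification with the open_clip library
# (pip install open_clip_torch). The architecture name, pretrained tag,
# image path, and class prompts are illustrative assumptions.
import torch
from PIL import Image
import open_clip

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k"
)
tokenizer = open_clip.get_tokenizer("ViT-B-32")
model.eval()

classes = ["dog", "cat", "car"]  # hypothetical label set
image = preprocess(Image.open("example.jpg")).unsqueeze(0)  # hypothetical file
text = tokenizer([f"a photo of a {c}" for c in classes])

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    # Cosine similarity between the image and each class prompt.
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print(dict(zip(classes, probs.squeeze(0).tolist())))
```

Prompt templates such as "a photo of a {class}" follow common CLIP zero-shot practice; ensembles of richer templates typically improve accuracy further.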

Related research

Broken Neural Scaling Laws (10/26/2022)
We present a smoothly broken power law functional form that accurately m...

Scaling Laws for the Few-Shot Adaptation of Pre-trained Image Classifiers (10/13/2021)
Empirical science of neural scaling laws is a rapidly growing area of si...

Getting ViT in Shape: Scaling Laws for Compute-Optimal Model Design (05/22/2023)
Scaling laws have been recently employed to derive compute-optimal model...

Revisiting Neural Scaling Laws in Language and Vision (09/13/2022)
The remarkable progress in deep learning in recent years is largely driv...

An Inverse Scaling Law for CLIP Training (05/11/2023)
CLIP, the first foundation model that connects images and text, has enab...

The choice of scaling technique matters for classification performance (12/23/2022)
Dataset scaling, also known as normalization, is an essential preprocess...

On Hate Scaling Laws For Data-Swamps (06/22/2023)
"Scale the model, scale the data, scale the GPU-farms" is the reigning s...
