Machine Learning Model Sizes and the Parameter Gap

07/05/2022
by Pablo Villalobos, et al.

We study trends in the model size of notable machine learning systems over time using a curated dataset. From 1950 to 2018, the size of language models increased steadily by seven orders of magnitude. The trend then accelerated, with model size increasing by another five orders of magnitude in just four years, from 2018 to 2022. Vision models grew at a more constant pace, totaling seven orders of magnitude of growth between 1950 and 2022. We also identify that, since 2020, there have been many language models below 20B parameters and many above 70B parameters, but a scarcity of models in the 20-70B parameter range. We refer to that scarcity as the parameter gap. We provide some stylized facts about the parameter gap and propose a few hypotheses to explain it. The explanations we favor are: (a) increasing model size beyond 20B parameters requires adopting different parallelism techniques, which makes mid-sized models less cost-effective, and (b) GPT-3 was one order of magnitude larger than previous language models, and researchers afterwards primarily experimented with bigger models to outperform it. While these dynamics likely exist, and we believe they play some role in generating the gap, we don't have high confidence that there are no other, more important dynamics at play.
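As a rough, self-contained illustration (the model list and bin width below are our own assumptions, not the paper's curated dataset), one way to look for such a gap is to bin reported parameter counts on a log scale and check which bins are sparsely populated:

```python
# A minimal sketch with hypothetical data, not the paper's curated dataset:
# bucket a few publicly reported model sizes by log10(parameter count)
# and see where a 20B-70B "parameter gap" would show up.
import math
from collections import Counter

model_params = {
    "GPT-2": 1.5e9,
    "T5-11B": 11e9,
    "GPT-NeoX-20B": 20e9,
    "GPT-3": 175e9,
    "Gopher": 280e9,
    "PaLM": 540e9,
}

# Half-decade bins of log10(parameters); 20B is roughly 10^10.3 and
# 70B roughly 10^10.85, so a gap in that range appears as sparse bins
# around 10^10.5.
bins = Counter(round(2 * math.log10(n)) / 2 for n in model_params.values())
for exponent in sorted(bins):
    print(f"~10^{exponent:<5} parameters: {bins[exponent]} model(s)")

# Rough growth-rate arithmetic from the abstract: five orders of magnitude
# between 2018 and 2022 is about 5 / 4 = 1.25 orders of magnitude per year.
print("Implied post-2018 growth:", 5 / 4, "orders of magnitude per year")
```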
