Understanding Scaling Laws for Recommendation Models

08/17/2022
by Newsha Ardalani et al.

Scale has been a major driving force in improving machine-learning performance, and understanding scaling laws is essential for strategic planning: sustaining growth in model quality, planning resources over the long term, and developing efficient system infrastructure to support large-scale models. In this paper, we study empirical scaling laws for DLRM-style recommendation models, in particular for Click-Through Rate (CTR) prediction. We observe that model quality scales as a power law plus a constant in model size, data size, and the amount of compute used for training. We characterize scaling efficiency along three resource dimensions, namely data, parameters, and compute, by comparing the different scaling schemes along these axes. We show that parameter scaling has run out of steam for the model architecture under study, and that until a higher-performing architecture emerges, data scaling is the path forward. The key research questions addressed by this study include: Does a recommendation model scale sustainably, as predicted by the scaling laws, or are we far from those predictions? What are the limits of scaling? What are the implications of the scaling laws for long-term hardware and system development?
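
As a concrete illustration of the "power law plus constant" relationship described above, the minimal sketch below fits L(x) = a * x^(-b) + c to hypothetical loss measurements with SciPy. The data points, initial guesses, and the helper name power_law_plus_constant are illustrative assumptions, not values or code from the paper.

    # Minimal sketch (assumed, not from the paper): fit the "power law plus
    # constant" form L(x) = a * x**(-b) + c to hypothetical loss measurements.
    import numpy as np
    from scipy.optimize import curve_fit

    def power_law_plus_constant(x, a, b, c):
        # Loss as a function of one scaled resource (data, parameters, or compute).
        return a * np.power(x, -b) + c

    # Hypothetical (resource size, normalized loss) measurements.
    x = np.array([1e6, 1e7, 1e8, 1e9, 1e10])
    y = np.array([0.520, 0.470, 0.440, 0.425, 0.418])

    # p0 is a rough initial guess; a real study would fit measured CTR losses.
    (a, b, c), _ = curve_fit(power_law_plus_constant, x, y,
                             p0=[1.0, 0.1, 0.4], maxfev=10000)
    print(f"L(x) ~= {a:.3f} * x^(-{b:.3f}) + {c:.3f}")
    # The constant c is the irreducible-loss asymptote: the quality floor
    # that scaling along this single resource axis cannot cross.

In this framing, comparing the fitted exponents and asymptotes across the data, parameter, and compute axes is what allows one to ask which resource still buys model quality per unit spent.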

Related research

02/13/2022 · Scaling Laws Under the Microscope: Predicting Transformer Performance from Small Scale Experiments
Neural scaling laws define a predictable relationship between a model's ...

06/29/2022 · Beyond neural scaling laws: beating power law scaling via data pruning
Widely observed neural scaling laws, in which error falls off as a power...

09/15/2023 · Scaling Laws for Sparsely-Connected Foundation Models
We explore the impact of parameter sparsity on the scaling behavior of T...

01/06/2023 · Myths and Legends in High-Performance Computing
In this humorous and thought provoking article, we discuss certain myths...

07/05/2023 · Scaling Laws Do Not Scale
Recent work has proposed a power law relationship, referred to as “scali...

02/14/2023 · Cliff-Learning
We study the data-scaling of transfer learning from foundation models in...

09/20/2023 · The Languini Kitchen: Enabling Language Modelling Research at Different Scales of Compute
The Languini Kitchen serves as both a research collective and codebase d...
