A Simplistic Model of Neural Scaling Laws: Multiperiodic Santa Fe Processes

02/17/2023
by Łukasz Dębowski, et al.

It has been observed that large language models exhibit a power-law decay of cross entropy with respect to the number of parameters and training tokens. Extrapolated literally, this decay implies that the entropy rate of natural language is zero. To better understand this phenomenon, or possibly an artifact of extrapolation, we construct a simple stationary stochastic process and a memory-based predictor for it that exhibit a power-law decay of cross entropy with a vanishing entropy rate. Our example builds on previously discussed Santa Fe processes, which decompose a random text into a process of narration and time-independent knowledge. Previous discussions assumed that the narration is a memoryless source with Zipf's distribution. In this paper, we propose a model of narration that has a vanishing entropy rate and uses a randomly chosen deterministic sequence called a multiperiodic sequence. Under a suitable parameterization, multiperiodic sequences exhibit asymptotic relative frequencies given by Zipf's law. Remaining agnostic about the true value of the entropy rate of natural language, we discuss the relevance of similar constructions for language modeling.
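As a rough, self-contained illustration of the setting, the sketch below samples from a Santa Fe process in its previously discussed form, i.e. with a memoryless, Zipf-distributed narration rather than the multiperiodic narration introduced in the paper. The function name, vocabulary size, and Zipf exponent are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def sample_santa_fe(n_tokens, zipf_exponent=2.0, vocab_size=10_000, seed=0):
    """Sample a Santa Fe process with memoryless, Zipf-distributed narration.

    Each emitted symbol is a pair (k, z_k): the index k ("narration") is drawn
    i.i.d. from a Zipf-like power-law distribution, and z_k is a fixed random
    bit ("time-independent knowledge") attached once and for all to index k.
    This is the memoryless-narration variant, not the paper's multiperiodic one.
    """
    rng = np.random.default_rng(seed)

    # Time-independent knowledge: one fair random bit per index.
    knowledge = rng.integers(0, 2, size=vocab_size)

    # Zipf-like narration: P(k) proportional to k^(-alpha) for k = 1..vocab_size.
    ranks = np.arange(1, vocab_size + 1)
    probs = ranks.astype(float) ** (-zipf_exponent)
    probs /= probs.sum()

    indices = rng.choice(ranks, size=n_tokens, p=probs)
    return [(int(k), int(knowledge[k - 1])) for k in indices]

if __name__ == "__main__":
    print(sample_santa_fe(10))  # e.g. [(1, 0), (3, 1), (1, 0), ...]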

Related research:

08/15/2022 · Rényi Cross-Entropy Measures for Common Distributions and Processes with Memory
Two Rényi-type generalizations of the Shannon cross-entropy, the Rényi c...

03/23/2023 · The Quantization Model of Neural Scaling
We propose the Quantization Model of neural scaling laws, explaining bot...

05/05/2022 · A Simple Contrastive Learning Objective for Alleviating Neural Text Degeneration
The cross-entropy objective has proved to be an all-purpose training obj...

11/23/2022 · Using Focal Loss to Fight Shallow Heuristics: An Empirical Analysis of Modulated Cross-Entropy in Natural Language Inference
There is no such thing as a perfect dataset. In some datasets, deep neur...

10/07/2020 · A Mathematical Exploration of Why Language Models Help Solve Downstream Tasks
Autoregressive language models pretrained on large corpora have been suc...

09/22/2020 · Entropic Compressibility of Lévy Processes
In contrast to their seemingly simple and shared structure of independen...

06/14/2017 · Is Natural Language a Perigraphic Process? The Theorem about Facts and Words Revisited
As we discuss, a stationary stochastic process is nonergodic when a rand...
