Information-theoretic convergence of extreme values to the Gumbel distribution

07/07/2020
by Oliver Johnson et al.

We show how convergence to the Gumbel distribution in an extreme value setting can be understood in an information-theoretic sense. We introduce a new type of score function which behaves well under the maximum operation, and which implies simple expressions for entropy and relative entropy. We show that, assuming certain properties of the von Mises representation, convergence to the Gumbel can be proved in the strong sense of relative entropy.
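The classical fact underlying this setting can be illustrated numerically. As a minimal sketch (a standard textbook example, not the paper's information-theoretic argument): if X_1, ..., X_n are i.i.d. Exp(1), then max_i X_i - log n converges in distribution to the standard Gumbel law with CDF G(x) = exp(-exp(-x)), whose mean is the Euler–Mascheroni constant and whose variance is pi^2/6.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate many independent maxima of n i.i.d. Exp(1) samples,
# recentred by log(n), which converge to the standard Gumbel law.
n, trials = 2_000, 5_000
samples = rng.exponential(size=(trials, n))
maxima = samples.max(axis=1) - np.log(n)

# The standard Gumbel has mean gamma ~ 0.5772 (Euler–Mascheroni)
# and variance pi^2 / 6 ~ 1.6449; the empirical moments should be close.
print(maxima.mean())  # close to 0.5772
print(maxima.var())   # close to 1.6449
```

The paper's contribution is to upgrade this distributional convergence to convergence in relative entropy, via a score function adapted to the maximum operation.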


