
Information-theoretic convergence of extreme values to the Gumbel distribution

by Oliver Johnson et al.
University of Bristol

We show how convergence to the Gumbel distribution in an extreme value setting can be understood in an information-theoretic sense. We introduce a new type of score function which behaves well under the maximum operation, and which implies simple expressions for entropy and relative entropy. We show that, assuming certain properties of the von Mises representation, convergence to the Gumbel can be proved in the strong sense of relative entropy.
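The classical starting point behind this result is that suitably normalized maxima of i.i.d. samples converge in distribution to the Gumbel law, whose CDF is exp(-exp(-x)). As a minimal numerical sketch (not from the paper, and only illustrating convergence in distribution rather than the stronger relative-entropy convergence proved there), one can check that the maximum of n standard exponentials, centred by log n, has an empirical CDF close to the Gumbel CDF:

```python
import math
import random

def sample_normalized_max(n, rng):
    # Max of n i.i.d. Exp(1) variables, centred by log n.
    # Classical extreme value theory: this converges to the Gumbel law.
    m = max(rng.expovariate(1.0) for _ in range(n))
    return m - math.log(n)

def gumbel_cdf(x):
    # Standard Gumbel distribution function.
    return math.exp(-math.exp(-x))

if __name__ == "__main__":
    rng = random.Random(0)
    trials = 2000
    for n in (10, 100, 1000):
        xs = sorted(sample_normalized_max(n, rng) for _ in range(trials))
        # Kolmogorov-Smirnov-style gap between empirical and Gumbel CDFs.
        gap = max(abs((i + 1) / trials - gumbel_cdf(x))
                  for i, x in enumerate(xs))
        print(f"n = {n:5d}   sup CDF gap ~ {gap:.3f}")
```

For exponentials the centring constant log n is exact; for other distributions in the Gumbel domain of attraction the normalizing sequences come from the von Mises representation that the paper's assumptions are phrased in terms of.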


