Continuity of Generalized Entropy and Statistical Learning

12/31/2020
by Aolin Xu et al.

We study the continuity of the generalized entropy, defined with respect to an action space and a loss function, as a functional of the underlying probability distribution, and use this property to address a basic question of statistical learning theory: the excess risk analysis of various learning methods. We first derive upper and lower bounds on the entropy difference of two distributions in terms of several commonly used f-divergences, the Wasserstein distance, and a distance that depends on the action space and the loss function. Examples accompany the discussion of each general result, comparisons are made with existing entropy difference bounds, and new mutual information upper bounds are derived from the new results. We then apply the entropy difference bounds to statistical learning theory. It is shown that the excess risks in the two popular learning paradigms, frequentist learning and Bayesian learning, can both be studied through the continuity of different forms of the generalized entropy. The analysis is then extended to the continuity of the generalized conditional entropy. This extension provides performance bounds for Bayes decision making with mismatched distributions. It also leads to excess risk bounds for a third learning paradigm, in which the decision rule is optimally designed under the projection of the empirical distribution onto a predefined family of distributions. Through the continuity of generalized entropy, we thus establish a unified method of excess risk analysis for the three major paradigms of statistical learning.
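To make the object of study concrete, here is a minimal numerical sketch in Python. It assumes the standard definition of the generalized entropy, H_loss(P) = min over actions a of E_{X~P}[loss(X, a)], which recovers Shannon entropy under the log loss, and it checks one elementary instance of the continuity property: |H(P) - H(Q)| <= (loss span) * d_TV(P, Q) for a bounded loss. The function names and the grid search are illustrative assumptions, and the total-variation bound shown is a textbook special case, not one of the paper's f-divergence or Wasserstein bounds.

```python
import numpy as np

# Generalized entropy of a distribution p over a finite alphabet,
# for a given action space and loss function:
#     H_loss(p) = min_{a in actions} E_{X~p}[ loss(X, a) ]
def generalized_entropy(p, actions, loss):
    """p: probability vector over {0, ..., n-1};
    actions: iterable of actions; loss(x, a): per-outcome loss."""
    n = len(p)
    return min(sum(p[x] * loss(x, a) for x in range(n)) for a in actions)

# Example 1: log loss with probability-vector actions recovers Shannon
# entropy, since E_p[-log q(X)] is minimized at q = p.
p = np.array([0.5, 0.3, 0.2])
log_loss = lambda x, q: -np.log(q[x])
# Search over a coarse grid of candidate predictive distributions q.
grid = [np.array([a, b, 1.0 - a - b])
        for a in np.linspace(0.01, 0.98, 50)
        for b in np.linspace(0.01, 0.98, 50) if a + b < 0.99]
h_approx = generalized_entropy(p, grid, log_loss)
h_shannon = -np.sum(p * np.log(p))
print(h_approx, h_shannon)  # grid minimum approaches Shannon entropy

# Example 2: for a bounded loss, the generalized entropy is Lipschitz in
# total variation: |H(P) - H(Q)| <= (max loss - min loss) * d_TV(P, Q).
q = np.array([0.4, 0.4, 0.2])
zero_one = lambda x, a: 0.0 if x == a else 1.0  # 0-1 loss, actions = outcomes
h_p = generalized_entropy(p, range(3), zero_one)
h_q = generalized_entropy(q, range(3), zero_one)
d_tv = 0.5 * np.abs(p - q).sum()
assert abs(h_p - h_q) <= 1.0 * d_tv + 1e-12  # loss span is 1 for 0-1 loss
print(h_p, h_q, d_tv)
```

In the 0-1 loss example, H(P) = 1 - max_x P(x), and for these two distributions the total-variation bound happens to hold with equality, illustrating that such a continuity bound can be tight.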


