Sharp finite-sample large deviation bounds for independent variables

08/30/2020
by Akshay Balsubramani et al.
We show an extension of Sanov's theorem in large deviations theory, controlling the tail probabilities of i.i.d. random variables with matching concentration and anti-concentration bounds. This result applies to samples of any size, and has a short information-theoretic proof using elementary techniques.
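The flavor of the result can be illustrated with the classical Chernoff–Sanov upper bound for Bernoulli variables, P(mean ≥ a) ≤ exp(−n · KL(a‖p)), which holds for every sample size n. This is a minimal numerical sketch of that textbook bound, not the paper's sharper matching concentration/anti-concentration result; the function names and parameter choices below are illustrative assumptions.

```python
import math
import random

def kl_bernoulli(a, p):
    """KL divergence KL(Ber(a) || Ber(p)), the large-deviation rate function."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def chernoff_sanov_bound(n, a, p):
    """Finite-sample tail bound: P(sample mean of n Ber(p) draws >= a) <= exp(-n KL(a||p))."""
    return math.exp(-n * kl_bernoulli(a, p))

def empirical_tail(n, a, p, trials=20000, seed=0):
    """Monte Carlo estimate of P(sample mean >= a) for n i.i.d. Ber(p) draws."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(trials)
        if sum(rng.random() < p for _ in range(n)) / n >= a
    )
    return hits / trials

# Illustrative parameters (assumed, not from the paper).
n, p, a = 50, 0.5, 0.7
emp = empirical_tail(n, a, p)
bnd = chernoff_sanov_bound(n, a, p)
print(f"empirical tail {emp:.4f} <= bound {bnd:.4f}")
```

The bound is valid for any n, but it is loose by a polynomial-in-n prefactor; sharpening that gap with a matching anti-concentration (lower) bound is the kind of refinement the abstract describes.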


research
02/12/2022

An Information-Theoretic Proof of the Kac–Bernstein Theorem

A short, information-theoretic proof of the Kac–Bernstein theorem, which...
research
05/12/2014

Sharp Finite-Time Iterated-Logarithm Martingale Concentration

We give concentration bounds for martingales that are uniform over finit...
research
04/08/2021

An Information-Theoretic Proof of a Finite de Finetti Theorem

A finite form of de Finetti's representation theorem is established usin...
research
04/11/2023

A Third Information-Theoretic Approach to Finite de Finetti Theorems

A new finite form of de Finetti's representation theorem is established ...
research
12/10/2022

High-dimensional Berry-Esseen Bound for m-Dependent Random Samples

In this work, we provide a (n/m)^-1/2-rate finite sample Berry-Esseen bo...
research
06/25/2023

Dual Induction CLT for High-dimensional m-dependent Data

In this work, we provide a 1/√(n)-rate finite sample Berry-Esseen bound ...
research
01/30/2019

Support Recovery in the Phase Retrieval Model: Information-Theoretic Fundamental Limits

The support recovery problem consists of determining a sparse subset of ...
