A Note on the Chernoff Bound for Random Variables in the Unit Interval

05/15/2022
by Andrew Y. K. Foong, et al.
University of Cambridge

The Chernoff bound is a well-known tool for obtaining a high-probability bound on the expectation of a Bernoulli random variable in terms of its sample average. This bound is commonly used in statistical learning theory to upper bound the generalisation risk of a hypothesis in terms of its empirical risk on held-out data, for the case of a binary-valued loss function. However, the extension of this bound to the case of random variables taking values in the unit interval is less well known in the community. In this note we provide a proof of this extension for convenience and future reference.
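For reference, one standard formulation of the bound discussed above is the following sketch; the precise statement and constants in the note itself may differ. Let $X_1, \dots, X_n$ be i.i.d. random variables taking values in $[0,1]$ with mean $\mu$, and let $\hat{\mu} = \frac{1}{n} \sum_{i=1}^{n} X_i$ denote their sample average. Then for any $0 < t \le \mu$,

\[
\mathbb{P}\big(\hat{\mu} \le \mu - t\big) \;\le\; \exp\big(-n \, \mathrm{kl}(\mu - t \,\|\, \mu)\big),
\qquad
\mathrm{kl}(q \,\|\, p) \;=\; q \log \frac{q}{p} + (1 - q) \log \frac{1 - q}{1 - p},
\]

where $\mathrm{kl}(q \,\|\, p)$ is the relative entropy between Bernoulli distributions with parameters $q$ and $p$. The extension from the Bernoulli case to the unit interval rests on the convexity inequality $e^{\lambda x} \le 1 - x + x e^{\lambda}$ for $x \in [0,1]$, which shows that the moment generating function of any $[0,1]$-valued random variable is dominated by that of a Bernoulli variable with the same mean, so the Bernoulli tail bound carries over unchanged.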

Related research

04/30/2019 · The Littlewood-Offord Problem for Markov Chains
The celebrated Littlewood-Offord problem asks for an upper bound on the ...

05/03/2019 · A Constructive Proof of a Concentration Bound for Real-Valued Random Variables
Almost 10 years ago, Impagliazzo and Kabanets (2010) gave a new combinat...

06/19/2022 · Deterministic Finite-Memory Bias Estimation
In this paper we consider the problem of estimating a Bernoulli paramete...

10/06/2020 · A Note on High-Probability versus In-Expectation Guarantees of Generalization Bounds in Machine Learning
Statistical machine learning theory often tries to give generalization g...

01/12/2021 · A new method for constructing continuous distributions on the unit interval
A novel approach towards construction of absolutely continuous distribut...

06/02/2021 · Statistical optimality conditions for compressive ensembles
We present a framework for the theoretical analysis of ensembles of low-...