Supermodular f-divergences and bounds on lossy compression and generalization error with mutual f-information

06/21/2022
by   Saeed Masiha, et al.

In this paper, we introduce super-modular f-divergences and present three applications for them: (i) we derive a Sanov-type upper bound on the tail probability of a sum of independent random variables based on super-modular f-divergence and show that this generalized Sanov bound strictly improves over the ordinary one; (ii) we consider the lossy compression problem, which studies the set of achievable rates for a given distortion and code length, extend the rate-distortion function using mutual f-information, and provide new and strictly better bounds on achievable rates in the finite blocklength regime using super-modular f-divergences; and (iii) we establish a connection between the generalization error of algorithms with bounded input/output mutual f-information and a generalized rate-distortion problem. This connection allows us to bound the generalization error of learning algorithms using lower bounds on the rate-distortion function. Our bound is based on a new lower bound on the rate-distortion function that (for some examples) strictly improves over previously best-known bounds. Moreover, super-modular f-divergences are utilized to reduce the dimension of the problem and obtain single-letter bounds.
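
For context, the classical objects underlying these results are standard; the following LaTeX sketch recalls the usual definitions and is not notation taken from the paper itself. In particular, it assumes the paper's mutual f-information follows the common convention of applying an f-divergence to the joint versus product-of-marginals distributions.

D_f(P \,\|\, Q) = \mathbb{E}_Q\!\left[f\!\left(\frac{dP}{dQ}\right)\right], \qquad f \text{ convex},\ f(1)=0,

I_f(X;Y) = D_f\!\left(P_{XY} \,\big\|\, P_X \otimes P_Y\right),

R(D) = \min_{P_{\hat X \mid X}:\ \mathbb{E}[d(X,\hat X)] \le D} I(X;\hat X).

Taking f(t) = t \log t recovers the Kullback-Leibler divergence, ordinary mutual information, and the classical rate-distortion function as the special case of this family.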
