Samplers and extractors for unbounded functions

04/17/2019
by Rohit Agrawal et al.

Blasiok (SODA'18) recently introduced the notion of a subgaussian sampler, defined as an averaging sampler for approximating the mean of functions f : {0,1}^m → R such that f(U_m) has subgaussian tails, and asked for explicit constructions. In this work, we give the first explicit constructions of subgaussian samplers (and in fact averaging samplers for the broader class of subexponential functions) that match the best-known constructions of averaging samplers for [0,1]-bounded functions in the regime of parameters where the approximation error ε and failure probability δ are subconstant. Our constructions are established via an extension of the standard notion of randomness extractor (Nisan and Zuckerman, JCSS'96) where the error is measured by an arbitrary divergence rather than total variation distance, and a generalization of Zuckerman's equivalence (Random Struct. Alg.'97) between extractors and samplers. We believe that the framework we develop, and specifically the notion of an extractor for the Kullback-Leibler (KL) divergence, are of independent interest. In particular, KL-extractors are stronger than both standard extractors and subgaussian samplers, but we show that they exist with essentially the same parameters (constructively and non-constructively) as standard extractors.
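For orientation, here is a rough sketch of the objects the abstract refers to, stated in our own simplified notation rather than quoted from the paper; in particular, the subgaussian normalization (variance proxy 1) below is an assumption. A function Samp : {0,1}^n → ({0,1}^m)^t is a (δ, ε) averaging sampler for a class F of functions f : {0,1}^m → R if for every f ∈ F,

\[ \Pr_{(z_1,\dots,z_t) \leftarrow \mathrm{Samp}(U_n)} \left[ \left| \frac{1}{t} \sum_{i=1}^{t} f(z_i) - \mathbb{E}[f(U_m)] \right| > \varepsilon \right] \le \delta . \]

The subgaussian case takes F to be the functions whose output on a uniform input has subgaussian tails, e.g.

\[ \mathbb{E}\left[ e^{\lambda \left( f(U_m) - \mathbb{E}[f(U_m)] \right)} \right] \le e^{\lambda^2 / 2} \quad \text{for all } \lambda \in \mathbb{R} . \]

The extractor generalization replaces total variation distance by an arbitrary divergence D: a function Ext : {0,1}^n × {0,1}^d → {0,1}^m is a (k, ε) extractor for D if for every source X on {0,1}^n with min-entropy at least k,

\[ D\big( \mathrm{Ext}(X, U_d) \,\big\|\, U_m \big) \le \varepsilon , \]

so that taking D to be total variation distance recovers the standard extractor notion of Nisan and Zuckerman, while taking D to be the KL divergence gives the KL-extractors discussed above.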

