Average Bias and Polynomial Sources

05/28/2019 · by Arnab Bhattacharyya, et al.

We identify a new notion of pseudorandomness for randomness sources, which we call the average bias. Given a distribution Z over {0,1}^n, its average bias is: b_av(Z) = 2^{-n} ∑_{c ∈ {0,1}^n} |E_{z ∼ Z} (-1)^{⟨c, z⟩}|. A source with average bias at most 2^{-k} has min-entropy at least k, so low average bias is a stronger condition than high min-entropy. We observe that the inner product function is an extractor for any source with average bias less than 2^{-n/2}. The notion of average bias is especially natural for polynomial sources, i.e., distributions sampled by low-degree n-variate polynomials over F_2. For the well-studied case of affine sources, it is easy to see that min-entropy k is exactly equivalent to average bias of 2^{-k}. We show that for quadratic sources, min-entropy k implies that the average bias is at most 2^{-Ω(√k)}. We use this relation to design dispersers for separable quadratic sources with a min-entropy guarantee.
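The definition of average bias can be computed directly by brute force for small n. The sketch below (the function names are our own, not from the paper) enumerates all characters c ∈ {0,1}^n and averages the absolute Fourier coefficients; it also builds a uniform distribution over an affine subspace, for which the abstract notes that dimension (min-entropy) k corresponds exactly to average bias 2^{-k}.

```python
from itertools import product

def average_bias(dist, n):
    """b_av(Z) = 2^-n * sum over c in {0,1}^n of |E_{z~Z} (-1)^<c,z>|.

    `dist` maps n-bit tuples to their probabilities.
    """
    total = 0.0
    for c in product((0, 1), repeat=n):
        # E_{z~Z} (-1)^<c,z>, with <c,z> the inner product over F_2
        e = sum(p * (-1) ** (sum(ci * zi for ci, zi in zip(c, z)) % 2)
                for z, p in dist.items())
        total += abs(e)
    return total / 2 ** n

def affine_source(n, basis, shift):
    """Uniform distribution over the affine subspace shift + span(basis) in F_2^n."""
    points = set()
    for coeffs in product((0, 1), repeat=len(basis)):
        z = list(shift)
        for a, b in zip(coeffs, basis):
            if a:
                z = [zi ^ bi for zi, bi in zip(z, b)]
        points.add(tuple(z))
    return {z: 1.0 / len(points) for z in points}
```

For example, a dimension-2 affine source in F_2^4, such as `affine_source(4, [(1,0,0,0), (0,1,0,0)], (0,0,1,1))`, has min-entropy 2 and average bias exactly 2^{-2} = 0.25, matching the claimed equivalence for affine sources.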





