Two decades ago, I named the recently departed Luca Trevisan's paper connecting extractors to pseudorandom generators as one of my favorite theorems from 1995-2004. I'm dedicating this month's favorite theorem to him.
Suppose we have two independent sources with just a little bit of entropy each. Can I pull out a single random bit? This month's favorite theorem, Explicit Two-Source Extractors and Resilient Functions by Eshan Chattopadhyay and David Zuckerman, shows us how, with a nice application to constructing Ramsey graphs.
More formally (feel free to skip this part): suppose we have two independent distributions U and V, each with polylog min-entropy, meaning that for every string x of length n, the probability of choosing x from U and the probability of choosing x from V are each at most \(2^{-(\log n)^c}\) for some constant c. There is a deterministic polytime function f (which doesn't depend on U and V) such that f(x,y), with x and y chosen independently from U and V, outputs 1 with probability \(1/2\pm\epsilon\) for \(\epsilon\) polynomially small in n.
Previous work required a linear amount of min-entropy for U and V.
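The Chattopadhyay-Zuckerman function is far too intricate to sketch in a blog post, but the classic Chor-Goldreich inner-product extractor gives the flavor of two-source extraction in that older linear-entropy regime. Here's a minimal Python sketch; the two flat sources and the helper names are just illustrative.

```python
# Toy two-source extraction via the Chor-Goldreich inner-product function.
# This is NOT the Chattopadhyay-Zuckerman construction: inner product only
# works when the two min-entropies sum to more than n (the linear regime).
from itertools import product
from math import log2

def min_entropy(dist):
    """Min-entropy of a distribution {string: prob}: -log2 of the max probability."""
    return -log2(max(dist.values()))

def f(x, y):
    """Inner product mod 2 of two equal-length bit strings."""
    return sum(int(a) & int(b) for a, b in zip(x, y)) % 2

def prob_output_one(dist_u, dist_v):
    """Pr[f(x,y) = 1] when x ~ U and y ~ V are drawn independently."""
    return sum(pu * pv
               for (x, pu), (y, pv) in product(dist_u.items(), dist_v.items())
               if f(x, y) == 1)

# Two independent flat sources on 4-bit strings, each uniform on 8 strings,
# so each has min-entropy 3 and the two entropies sum to 6 > n = 4.
U = {format(i, "04b"): 1 / 8 for i in (1, 2, 4, 7, 8, 11, 13, 14)}
V = {format(i, "04b"): 1 / 8 for i in (3, 5, 6, 9, 10, 12, 14, 15)}
print(min_entropy(U), min_entropy(V))  # 3.0 3.0
print(prob_output_one(U, V))           # 0.5625, within 1/4 of fair
```

Lindsey's lemma bounds the bias here by \(\tfrac{1}{2}\sqrt{2^n/(|\mathrm{supp}(U)|\,|\mathrm{supp}(V)|)}\), which for these toy sources is exactly 1/4, and which is why inner product needs linear min-entropy to be useful.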
As a corollary, we can use f to deterministically generate a Ramsey graph on n vertices with no cliques or independent sets of size \(2^{(\log\log n)^c}\) for a sufficiently large c, an exponential improvement over previous constructions. Gil Cohen gave an independent construction of such Ramsey graphs that doesn't go through extractors.
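The reduction behind this corollary is easy to see on a toy scale: let the vertices be the n-bit strings and join x and y exactly when f(x,y) = 1 (the edge function must be symmetric, which inner product is). Below is a small sketch with inner product again standing in for the actual CZ function, so the homogeneous sets only shrink to about \(\sqrt{N}\) rather than the quasi-polylogarithmic bound above.

```python
# Toy extractor-to-Ramsey-graph reduction on N = 2^n vertices: connect
# x and y whenever f(x, y) = 1. With the inner-product bit in place of
# the Chattopadhyay-Zuckerman function, the largest homogeneous set is
# around sqrt(N) rather than quasi-polylog in N.
n = 4
N = 1 << n  # 16 vertices, one per n-bit string

def f(x, y):
    """Symmetric edge predicate: inner product of the binary expansions, mod 2."""
    return bin(x & y).count("1") % 2

adj = [[f(x, y) for y in range(N)] for x in range(N)]

def largest_homogeneous_set():
    """Brute-force the largest clique or independent set (fine for N = 16)."""
    best = 0
    for mask in range(1, 1 << N):           # every nonempty vertex subset
        verts = [v for v in range(N) if mask >> v & 1]
        if len(verts) <= best:
            continue
        pairs = [adj[a][b] for i, a in enumerate(verts) for b in verts[i + 1:]]
        if all(pairs) or not any(pairs):    # all edges, or no edges at all
            best = len(verts)
    return best

print(largest_homogeneous_set())  # 5: no 6 of the 16 vertices are homogeneous
```

Swapping in a stronger f immediately shrinks the largest homogeneous set; that is the whole content of the corollary.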
There have been several papers improving the bounds of Chattopadhyay and Zuckerman. In FOCS 2023, Xin Li gave a construction of two-source extractors for sources of \(O(\log n)\) min-entropy, the current state of the art for extracting a single random bit with constant error, as well as Ramsey graphs with no cliques or independent sets of size \(\log^c n\) for some constant c.