Prediction from Partial Information and Hindsight, with Application to Circuit Lower Bounds

by Or Meir and Avi Wigderson

Oded's comments

I find the ``extraction problem'' considered in the paper interesting per se. It asks how much randomness is left in a single bit of a string that has very high entropy, after an adversary has queried a few of the other bits of the string. The application to circuit lower bounds (i.e., Thm 1.9) is also very interesting.
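To make the problem concrete, consider the following standard toy example (my own illustration, not taken from the paper): let $X$ be uniform over the strings in $\{0,1\}^n$ of even parity, so that $H(X)=n-1$ (i.e., $k=1$). Each coordinate $X_i$ is uniform, and it remains exactly uniform conditioned on the answers to any $n-2$ queries to other coordinates, whereas querying all of the other $n-1$ coordinates determines $X_i$ completely. Thus, for $k=1$, the coordinate keeps looking random until roughly $n/k=n$ other coordinates have been queried, matching the form of the bound stated in the abstract below.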

The original abstract

Consider a random sequence of $n$ bits that has entropy at least $n-k$, where $k\ll n$. A commonly used observation is that an average coordinate of this random sequence is close to being uniformly distributed, that is, the coordinate looks random. In this work, we prove a stronger result that says, roughly, that the average coordinate looks random to an adversary that is allowed to query $\approx\frac{n}{k}$ other coordinates of the sequence, even if the adversary is non-deterministic. This setting generalizes decision trees and certificates for Boolean functions.

As an application of this result, we prove a new result on depth-$3$ circuits, which recovers as a direct corollary the known lower bounds for the parity and majority functions, as well as a lower bound on sensitive functions due to Boppana (IPL 63(5), 1997). An interesting feature of this proof is that it works in the framework of Karchmer and Wigderson (SIDMA 3(2), 1990), and in particular it is a top-down proof (Hastad, Jukna and Pudlak, Computational Complexity 5(2), 1995). Finally, it yields a new kind of a random restriction lemma for non-product distributions, which may be of independent interest.
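Regarding the ``commonly used observation'' mentioned in the first paragraph of the abstract, the standard argument behind it is short (a sketch in my own words, not quoted from the paper): by subadditivity of entropy, $\sum_{i=1}^{n} H(X_i) \geq H(X) \geq n-k$, so the average coordinate has entropy at least $1-\frac{k}{n}$; and a bit of entropy at least $1-\delta$ has bias at most $O(\sqrt{\delta})$ (e.g., by Pinsker's inequality), so the average coordinate is within statistical distance $O(\sqrt{k/n})$ of a uniform bit. The theorem of the paper strengthens the conclusion: the coordinate remains close to uniform even from the viewpoint of an adversary that first queries $\approx\frac{n}{k}$ other coordinates.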

See ECCC TR17-149.
