Time Hierarchies for Sampling Distributions

by Thomas Watson

Oded's comments

This work is actually a merger of two remotely related works. The first work (or part) provides a time-hierarchy result for the task of (approximately) generating distributions over $k\geq2$ elements, while noting that the case of $k=1$ corresponds to BPtime, for which no such result is known. Specifically, for every $k\geq2$ and every polynomial bound $p$, there exists a probability ensemble $(D_n)_{n\in\N}$ over $[k]$ that can be sampled in time $p$ but cannot be approximately sampled in polynomial time (i.e., any polynomial-time sampler produces an ensemble that (on infinitely many $n$'s) is almost as far from $D_n$ as the uniform distribution is).

The proof uses a reduction to a new type of (list) decoding problem, which refers to codewords over an alphabet of size $k$ and to an error model in which each symbol of the codeword is replaced by a non-trivial subset of $[k]$ that contains it (i.e., the symbol $\sigma$ may be replaced by any subset $A$ such that $\sigma\in A \neq [k]$).

The foregoing problem can be solved by a reduction to standard list decoding and by using an off-the-shelf list decoding result. Specifically, given a corrupted word $(A_1,\dots,A_n)$, we may consider $k-1$ instances of standard list decoding, where in the $i$th instance we refer to the word $(a_1,\dots,a_n)$ such that $a_j$ is the $i$th element of $A_j$ (defined arbitrarily when $|A_j|<i$). Since each $A_j$ has at most $k-1$ elements and contains the true symbol, some instance agrees with the original codeword on at least a $1/(k-1)$ fraction of the positions. By using a suitable list-decodable code, we may obtain a list of $poly(k)$ candidates, and this will do for the proof of the time hierarchy (*).
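The construction of the $k-1$ standard instances can be sketched as follows. This is only an illustration of the reduction step described above (the list decoder itself is not shown); the tie-breaking rule for subsets with fewer than $i$ elements is an arbitrary choice, since any fixed rule suffices for the pigeonhole argument.

```python
def reduce_to_standard_instances(corrupted, k):
    """Given a corrupted word (A_1, ..., A_n), where each A_j is a
    non-trivial subset of [k] = {0, ..., k-1} containing the true
    symbol, build the k-1 standard corrupted words: in the i-th word,
    position j holds the i-th smallest element of A_j (repeating the
    largest element when |A_j| < i+1, an arbitrary fixed choice).
    Since |A_j| <= k-1 and the true symbol lies in A_j, at least one
    of the k-1 words agrees with the codeword on >= n/(k-1) positions,
    so running a standard list decoder on each word suffices."""
    words = []
    for i in range(k - 1):
        word = []
        for A in corrupted:
            elems = sorted(A)
            word.append(elems[min(i, len(elems) - 1)])
        words.append(word)
    return words
```

For example, with $k=3$ and corrupted word $(\{0,1\},\{2\},\{1,2\})$, the two instances are $(0,2,1)$ and $(1,2,2)$; if the transmitted codeword was $(0,2,1)$, the first instance agrees with it everywhere.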

However, the author seeks a solution that is optimal with respect to the list size (i.e., list size $k-1$) as well as simpler and self-contained. This is provided in the second part of the work.

(*) The crucial aspects here are that list decoding is feasible at agreement rate $1/(k-1) > 1/k$ with a list size that only depends on $k$, and that the construction works for any value of $k\geq2$ (rather than for certain values only, e.g., primes). On the other hand, the information rate of the code is less crucial, as long as the block-length is polynomial in the information content.

The original abstract

We prove that for every constant $k \geq 2$, every polynomial time bound $t$, and every polynomially small $\epsilon$, there exists a family of distributions on $k$ elements that can be sampled exactly in polynomial time but cannot be sampled within statistical distance $1-(1/k)-\epsilon$ in time $t$. Our proof involves reducing the problem to a communication problem over a certain type of noisy channel. We solve the latter problem by giving a construction of a new type of list-decodable code, for a setting where there is no bound on the number of errors but each error gives more information than an erasure.
