I find the general technique very appealing. I wish to emphasize that obtaining property testing lower bounds via communication complexity problems is highly non-trivial, since it calls for reducing a communication problem to a testing problem that (in general) has no a priori input partition, and offers no natural clue as to how to partition the tested input so that the inputs to the communication problem can be embedded in it.

Indeed, the new technique is a methodology, and as such its application involves a creative step. The user (i.e., the lower-bound prover) has to come up with a decomposition of the input that allows a local reduction of the individual inputs (of the communication problem) into two parts that, when composed back, yield an input for the testing problem. Furthermore, the composition should be locally computable (i.e., its value at any desired point should be determined by few bits of each of the two parts). Viewing the testing problem as referring to a set of functions, denoted $\Pi$, one should present a decomposition of functions into pairs of functions. Specifically, one should suggest a communication problem in which the inputs of the two parties are functions, denoted $f$ and $g$, such that $h=\mathrm{compose}(f,g)$ is in $\Pi$ iff the pair $(f,g)$ is a yes-instance of the communication problem. An example, taken from Section 1.2 of the paper, follows.

Consider the task of proving a lower bound on the query complexity of testing $k$-linearity (i.e., testing whether the given function is a linear function that depends on exactly $k$ variables). We use a reduction from the communication problem SET DISJOINTNESS, restricted to instances in which the sets are of size $k$ (and the intersection is either empty or a singleton). The parties are given $k$-subsets, $A$ and $B$, of $[n]$, and (locally) form the linear functions $f(x)=\sum_{i\in A} x_i$ and $g(x)=\sum_{i\in B} x_i$, respectively. They use an algorithm for testing $2k$-linearity, invoked on the (imaginary) function $h=f+g$. The parties emulate the testing algorithm in the joint-randomness model: whenever the algorithm queries $h$ at a point $x$, they send each other the values $f(x)$ and $g(x)$. Finally, observe that $h$ is $2k$-linear iff $A$ and $B$ are disjoint (and otherwise $h$ is $(2k-2)$-linear).
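The key observation can be checked concretely: over GF(2), the support of $h=f+g$ is the symmetric difference of $A$ and $B$. Below is a minimal sketch of this fact, with illustrative sets of my own choosing (not taken from the paper):

```python
# Over GF(2), f(x) = sum_{i in A} x_i is |A|-linear, and h = f + g is the
# linear function whose support is the symmetric difference of A and B.
# Illustrative parameters (not from the paper): n = 8, k = 3.

def parity(support, x):
    """Linear function over GF(2): XOR of the bits of x indexed by `support`."""
    return sum(x[i] for i in support) % 2

A = {0, 1, 2}          # Alice's k-subset
B_disjoint = {3, 4, 5} # disjoint from A
B_overlap = {2, 3, 4}  # intersects A in exactly one element

for B in (B_disjoint, B_overlap):
    support_h = A ^ B  # support of h = f + g is the symmetric difference
    print(len(support_h))  # prints 6 (= 2k) for disjoint, 4 (= 2k - 2) otherwise
```

So distinguishing a $2k$-linear $h$ from a $(2k-2)$-linear $h$ amounts to deciding disjointness of $A$ and $B$.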

Indeed, the foregoing description refers to the standard model of randomized communication complexity (i.e., the joint-randomness model). Under this emulation, adaptive testers yield standard two-way communication protocols, whereas non-adaptive testers yield one-way communication protocols, and one-sided error probability is preserved. The point of all these comments is that they offer a variety of potential lower-bound attempts, each relying on a different existing communication complexity lower bound.
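The cost accounting behind these reductions is simple: each query the tester makes to $h$ costs two bits of communication (one from each party), so a $q$-query tester yields a protocol of cost $2q$, and a communication lower bound translates into a query lower bound. A hedged sketch of this emulation step, with names and values of my own (not from the paper):

```python
# Emulating one query of the tester: Alice sends f(x), Bob sends g(x),
# and both can then compute h(x) = f(x) + g(x) over GF(2).
# Illustrative instance (mine, not the paper's): n = 8, k = 3.

def parity(support, x):
    return sum(x[i] for i in support) % 2

def emulate_query(A, B, x, transcript):
    fx = parity(A, x)          # Alice evaluates f(x) and sends one bit
    gx = parity(B, x)          # Bob evaluates g(x) and sends one bit
    transcript.extend([fx, gx])
    return fx ^ gx             # both parties now know h(x)

A, B = {0, 1, 2}, {3, 4, 5}
transcript = []
queries = [[(i * 7 + j) % 2 for i in range(8)] for j in range(5)]
for x in queries:
    hx = emulate_query(A, B, x, transcript)
    assert hx == parity(A ^ B, x)  # h agrees with the linear function on A ^ B

print(len(transcript))  # prints 10: two bits of communication per query
```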

Let me mention that the new method allows settling all three conjectures of [23] (only one is addressed in the current write-up of this work, but the other two are doable too; details will appear in a forthcoming revision). However, this is not the reason for my excitement about the new method, but rather an excellent demonstration of its power and versatility.

P.S.: I found the paper
HERE.

Update (4/4/11): Posted as
ECCC TR11-045.

We develop a new technique for proving lower bounds in property testing, by showing a strong connection between testing and communication complexity. We give a simple scheme for reducing communication problems to testing problems, thus allowing us to use known lower bounds in communication complexity to prove lower bounds in testing. This scheme is general and implies a number of new testing bounds, as well as simpler proofs of several known bounds.

For the problem of testing whether a boolean function is k-linear (a parity function on k variables), we achieve a lower bound of Omega(k) queries, even for adaptive algorithms with two-sided error, thus confirming a conjecture of Goldreich [23]. The same argument behind this lower bound also implies a new proof of known lower bounds for testing related classes such as k-juntas. For some classes, such as the class of monotone functions, and the class of s-sparse GF(2) polynomials, we significantly strengthen the best known bounds.
