Direct Sums in Randomized Communication Complexity
by Boaz Barak, Mark Braverman, Xi Chen, and Anup Rao.
Oded's comments
I would highlight more the new models and results regarding compressing
any communication protocol at the protocol level rather than at
the message level. Indeed, these models and results seem complementary
to Schulman et al.'s treatment of error-correction at the protocol level.
Thus, these two works extend the classical treatments of compression
and error-correction from the single-message level to the protocol level.
The direct sum result follows by starting with a protocol
for the $k$-wise problem, deriving a protocol for a single
instance that has the same communication complexity but
low "information content", and then reducing the
communication complexity by using the new compression
scheme (at the protocol level).
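This argument can be sketched in symbols (a rough reconstruction from the abstract, ignoring polylogarithmic factors; here $C$ denotes communication, $I$ information content, and $R(f)$ the randomized communication complexity of a single copy):

```latex
% Sketch of the direct-sum argument (polylog factors omitted).
% Suppose \pi computes n copies of f with communication C.
\begin{align*}
  &\text{1. From } \pi \text{, derive a protocol } \pi' \text{ for a single copy with} \\
  &\qquad CC(\pi') \le C, \qquad I(\pi') \le C/n. \\
  &\text{2. Compress } \pi' \text{ into a protocol with communication} \\
  &\qquad \tilde{O}\!\left(\sqrt{CC(\pi') \cdot I(\pi')}\right)
     \le \tilde{O}\!\left(\sqrt{C \cdot C/n}\right) = \tilde{O}\!\left(C/\sqrt{n}\right). \\
  &\text{3. Since the compressed protocol still computes } f,
     \text{ we get } C \ge \tilde{\Omega}\!\left(\sqrt{n}\right) \cdot R(f).
\end{align*}
```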
The original abstract
Does computing several copies of a function require more computational
effort than computing a single copy? In this work, we give the first
answers to this question for the model of randomized communication
complexity. Ignoring logarithmic factors, we show that:

Computing n copies of a function requires sqrt{n} times the
communication.

For average case complexity, given any distribution mu on inputs,
computing n copies of the function on n independent inputs sampled
according to mu requires at least sqrt{n} times the communication for
computing one copy.

If mu is a product distribution, computing n copies on n independent
inputs sampled according to mu requires n times the communication.
We also study the complexity of computing the parity of n evaluations
of f, and obtain results analogous to those above.
Our results are obtained by designing compression schemes for
communication protocols that can be used to compress the communication
in a protocol that does not transmit a lot of information about its inputs.
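In symbols, the compression result may be paraphrased roughly as follows (a hedged restatement, with polylogarithmic factors omitted): a protocol with communication $C$ that reveals only $I$ bits of information about the inputs can be simulated by a protocol with much less communication.

```latex
% Rough statement of the compression theorem (polylog factors omitted).
CC(\pi) = C,\; I(\pi) = I
  \;\Longrightarrow\;
  \text{a simulation with } \tilde{O}\!\left(\sqrt{C \cdot I}\,\right)
  \text{ communication,}
```
and, for a product input distribution, with $\tilde{O}(I)$ communication.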