The program-enumeration bottleneck in average-case complexity theory
by Luca Trevisan
Oded's comments
These three theorems of Levin are among my favorites.
I note that, while they are often criticized for the overhead incurred
by the enumeration, this is a secondary consideration that arises only
once one understands their proofs; that is, beforehand, one would probably
consider their statements to be bluntly wrong (or at least very challenging).
(Indeed, clever researchers tend to forget the state of mind they were
in before understanding a simple proof, and tell themselves that they
gained nothing from the theorem; but this comment belongs to my
opinion page on
What's wrong with STOC/FOCS?)
Anyhow, I always had a feeling that the effect of enumeration
is unavoidable in these results, but I guess I was not brave enough
to try to articulate this feeling (let alone instantiate it).
Note that the same may be true also with respect to the Cook-Levin
Theorem (i.e., NP-completeness of SAT or of Bounded Halting),
but it is not clear how to articulate this in that context.
Indeed, the reduction to pathological subcases that goes "unpunished"
in worst-case complexity has a clear cost in the context of
average-case complexity, which is indeed a satisfying feature
of that theory.
The original abstract
Three fundamental results of Levin involve algorithms or reductions
whose running time is exponential in the length of certain programs.
We study the question of whether such dependency can be made polynomial.

Levin's ``optimal search algorithm'' performs at most a constant
factor more slowly than any other fixed algorithm.
The constant, however, is exponential in the length
of the competing algorithm.
We note that the running time of a universal search cannot be made
``fully polynomial'' (that is, the relation between slowdown
and program length cannot be made polynomial), unless P=NP.
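The source of the exponential constant can be illustrated with a small runnable sketch of enumeration-with-dovetailing. Everything here is a hypothetical stand-in: real universal search enumerates all programs, whereas this toy uses a short fixed list of candidate "programs" (Python generators) so that it actually runs; the 2^{-i} time share given to the i-th program is exactly where the factor exponential in program length comes from, since a program of length L first appears around index 2^L in the enumeration.

```python
# Toy sketch of Levin-style universal search (hypothetical stand-in code).
# Program i gets roughly a 2^-i share of each doubling time budget; a
# candidate answer is accepted only after verification by `check`.

def universal_search(programs, instance, check):
    """Dovetail over `programs`, giving program i a 2^-(i+1) share of each
    phase's budget, and return the first verified answer."""
    runs = [prog(instance) for prog in programs]  # each program: instance -> generator of candidates
    budget = 1
    while True:
        for i, run in enumerate(runs):
            steps = budget >> (i + 1)  # exponentially decaying time share
            for _ in range(steps):
                try:
                    candidate = next(run)
                except StopIteration:
                    break  # this program halted without success
                if candidate is not None and check(instance, candidate):
                    return candidate
        budget *= 2  # next phase: double the total budget

# Example search problem: find an integer square root of 49.
def exhaustive_solver(y):
    # A slow but correct "program": try x = 0, 1, 2, ...
    x = 0
    while True:
        yield x if x * x == y else None
        x += 1

def useless_solver(y):
    # A "program" that never finds anything.
    while True:
        yield None

found = universal_search([useless_solver, exhaustive_solver], 49,
                         lambda y, x: x * x == y)
```

Note that `useless_solver` receives the largest time share yet cannot delay the overall search by more than a constant factor, which is the point of the optimality claim.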

Levin's ``universal one-way function'' result has the following structure:
there is a polynomial-time computable function f_Levin such that if there is
a polynomial-time computable adversary A that inverts f_Levin on an inverse
polynomial fraction of inputs, then for every polynomial-time
computable function g there also is a polynomial-time
adversary A_g that inverts g on an inverse
polynomial fraction of inputs. Unfortunately,
again the running time of A_g depends
exponentially on the bit length of the program that computes g in
polynomial time.
We show that
a fully polynomial uniform reduction from an arbitrary one-way
function to a specific one-way function is not possible relative to an
oracle that we construct, and so no ``universal one-way function'' can
have a fully polynomial security analysis via relativizing techniques.
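A hedged sketch of the usual construction behind the ``universal one-way function'', and of where the exponential loss enters (the exact step bound and input encoding vary by presentation and are not taken from this paper):

```latex
% Standard construction (details vary by presentation):
%   run machine M on input x for at most |x|^2 steps.
f_{\mathrm{Levin}}(M,x) \;=\; \bigl(M,\ M(x)\ \text{truncated to}\ |x|^2\ \text{steps}\bigr).
% For a fixed machine M_g computing g, inputs of the form (M_g, x) are only
% a 2^{-|M_g|} fraction of all inputs of the relevant length. Hence an
% adversary A inverting f_Levin on a 1/q(n) fraction of inputs is only
% guaranteed to invert on a
%   1/\bigl(q(n)\,2^{|M_g|}\bigr)
% fraction of the inputs corresponding to g; restoring inverse-polynomial
% success for A_g costs a factor exponential in |M_g|, which is the
% dependence the abstract refers to.
```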

Levin's completeness result for distributional NP problems implies
that if a specific problem in NP is easy on average under the uniform
distribution, then every language L in NP is also easy on average
under any polynomial-time computable distribution.
The running time of the implied algorithm for L, however, depends
exponentially on the bit length of the nondeterministic polynomial-time
Turing machine that decides L.
We show that if a completeness result for distributional NP can be proved
via a ``fully uniform'' and ``fully polynomial'' time reduction, then
there is a worst-case to average-case reduction for NP-complete problems.
In particular, this means
that a fully polynomial completeness result for distributional NP is impossible,
even via randomized truth-table reductions, unless the polynomial
hierarchy collapses.
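The quantitative shape of the completeness result, and of the ``fully polynomial'' strengthening that the paper rules out, can be written schematically as follows (the particular polynomials are illustrative assumptions, not taken from the paper):

```latex
% If the complete distributional problem is solvable in average time t(n),
% then any (L, D) in DistNP, with L decided by a nondeterministic
% polynomial-time machine M_L, is solvable in average time roughly
%   \mathrm{poly}\bigl(t(\mathrm{poly}(n))\bigr)\cdot 2^{O(|M_L|)}.
% A ``fully polynomial'' completeness result would replace the factor
% 2^{O(|M_L|)} by \mathrm{poly}(|M_L|); the abstract states that such a
% result (via fully uniform reductions) would yield a worst-case to
% average-case reduction for NP-complete problems.
```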