## Worst-Case Running Times for Average-Case Algorithms

by L. Antunes and L. Fortnow

This work builds on an earlier work of the authors and Vinodchandran (2003), who showed that the exact (i.e., pointwise) complexity of a problem (or rather any pointwise behavior of an algorithm) that is polynomial on the average w.r.t. a certain distribution is characterized by a function related to that distribution. Specifically, both the distribution and the pointwise bound are determined by the polynomial-time Kolmogorov distribution.
The current work goes beyond the previous one by proving that, under standard assumptions that allow derandomization, the polynomial-time Kolmogorov distribution is "universal" for the class of P-samplable distributions. Here universality means that this distribution dominates all P-samplable distributions, a fact that seems to be of independent interest.
[To appear in CCC'09.]

#### The original abstract

Under a standard hardness assumption, we exactly characterize the worst-case running time of languages that are in average polynomial-time over all polynomial-time samplable distributions. More precisely, we show that if exponential time is not infinitely often in subexponential space, then the following are equivalent for any algorithm $A$:
• For all P-samplable distributions $\mu$, $A$ runs in time polynomial on $\mu$-average.
• For all polynomials $p$, the running time of $A$ is bounded by $2^{O(K^p(x)-K(x)+\log(|x|))}$ for *all* inputs $x$,
where $K(x)$ is the Kolmogorov complexity (size of smallest program generating $x$) and $K^p(x)$ is the size of the smallest program generating $x$ within time $p(|x|)$.
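For readers less familiar with the average-case terminology, the standard way to make "polynomial on $\mu$-average" precise is Levin's definition; the paper presumably uses this notion or an equivalent variant, so the following is a sketch rather than the paper's exact formalization:

```latex
% Levin's notion of average polynomial time (standard formalization;
% the paper may use an equivalent variant).
% An algorithm A with running time t_A runs in time polynomial on
% mu-average if some fixed root of its running time has finite
% expectation, normalized by the input length:
\exists\, \varepsilon > 0 \;:\quad
  \sum_{x} \mu(x)\,\frac{t_A(x)^{\varepsilon}}{|x|} \;<\; \infty .
```

This robust definition (rather than the naive expectation of $t_A$) is what makes the class closed under polynomial composition.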
To prove this result we show that, under the hardness assumption, the polynomial-time Kolmogorov distribution, $m^p(x)=2^{-K^p(x)}$, is universal among the P-samplable distributions.
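Universality here is the standard notion of domination between distributions; roughly (this is the usual definition, stated up to the polynomial slack such results typically allow):

```latex
% nu dominates mu if mu is at most a polynomial factor below nu:
\exists\ \text{polynomial } q\ \ \forall x \;:\quad
  \mu(x) \;\le\; q(|x|)\cdot \nu(x).
% Universality of the polynomial-time Kolmogorov distribution:
% for every P-samplable mu there is a polynomial p such that
% m^p(x) = 2^{-K^p(x)} dominates mu in the above sense.
```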
