This work reports another step in the research journey towards better derandomization of space-bounded randomized computation; prior recent steps were recommended by me in postings 283 and 299. Again, favorite themes such as first obtaining a mild approximation and then improving it (via the Richardson iteration) play a pivotal role.* In addition, the paper provides an exceptionally nice exposition, which seems accessible also to non-experts.
*) I view this as a special case of the ZigZag Method.
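The "mild approximation, then boost" theme can be illustrated numerically. The following is a minimal sketch of the Richardson iteration in its simplest matrix form, not the paper's space-bounded algorithm: the test matrix, the crude approximate inverse `M`, and all parameter values are illustrative choices of mine. Given `M` with `||I - AM|| < 1`, each step `X <- X + M(I - AX)` multiplies the residual `I - AX` by `I - AM`, so accuracy improves exponentially in the number of steps.

```python
import numpy as np

rng = np.random.default_rng(0)

# A well-conditioned test matrix (illustrative stand-in; the papers work
# with Laplacian-like matrices arising from random walks).
n = 8
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))

# A deliberately crude approximate inverse M, playing the role of the
# "mild approximation" (here: the true inverse perturbed by noise).
M = np.linalg.inv(A) + 0.05 * rng.standard_normal((n, n))

def richardson(A, M, k):
    """Boost the approximate inverse M of A by k Richardson steps.

    Each step performs X <- X + M (I - A X); the residual I - A X is
    multiplied by (I - A M) per step, so it shrinks geometrically
    whenever ||I - A M|| < 1.
    """
    X = M.copy()
    I = np.eye(A.shape[0])
    for _ in range(k):
        X = X + M @ (I - A @ X)
    return X

X = richardson(A, M, 20)
print(np.linalg.norm(np.eye(n) - A @ X))  # residual after boosting
```

After 20 steps the residual of `X` is many orders of magnitude smaller than that of the crude `M`, even though each step reuses only the cheap approximation `M`.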
Update (by Ted Pyne, 17-Nov-22): This work was recently superseded by a pair of papers, ECCC TR22-149 and ECCC TR22-150, which obtain near-optimal complexity for iterated matrix multiplication (whereas the prior result held for powering a single matrix). Interestingly, although the final result is the same, the ideas used to go beyond the single-matrix case are totally different.
Available from ECCC TR22-008.

The original abstract

We give a deterministic space-efficient algorithm for approximating powers of stochastic matrices. On input a $w \times w$ stochastic matrix $A$, our algorithm approximates $A^{n}$ in space $\widetilde{O}(\log n + \sqrt{\log n}\cdot \log w)$ to within high accuracy. This improves upon the seminal work by Saks and Zhou (FOCS 1995), which requires $O(\log^{3/2}n + \sqrt{\log n} \cdot \log w)$ space, in the regime $n \gg w$.
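For concreteness, the object being approximated in the abstract above is the $n$-th power of a stochastic matrix, which (ignoring space constraints) is computed with $O(\log n)$ matrix products by repeated squaring. The sketch below is illustrative only: plain repeated squaring stores full intermediate matrices and does not achieve the small space bound that is the paper's actual contribution; the example matrix is mine.

```python
import numpy as np

def mat_pow(A, n):
    """Compute A^n via repeated squaring, i.e. O(log n) matrix products.

    The space-efficient algorithms recurse on this squaring structure,
    but avoid storing the intermediate matrices exactly (this sketch
    keeps everything in memory).
    """
    result = np.eye(A.shape[0])
    while n:
        if n & 1:             # current bit of n is set: multiply it in
            result = result @ A
        A = A @ A             # square for the next bit
        n >>= 1
    return result

# A tiny row-stochastic matrix (each row sums to 1), as in the abstract.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(mat_pow(P, 1000))  # each row is close to the stationary distribution
```

For a stochastic matrix, high powers converge to a matrix whose rows are the stationary distribution of the corresponding random walk; here that distribution is $(2/3, 1/3)$.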