(As in Nr 383, I missed this work before, for the same reasons.)
This work refers to testing graph properties in the dense graph model (aka the adjacency matrix model). Hence, distances are measured as the fraction of disagreements over the size of the adjacency matrix (i.e., the number of vertices squared).
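For concreteness, here is a minimal Python sketch (mine, not taken from the paper) of this distance measure: the fraction of the n^2 adjacency-matrix entries on which two graphs on the same vertex set disagree.

import numpy as np

def relative_distance(A, B):
    """Distance in the dense model: the fraction of entries on which two
    symmetric 0/1 adjacency matrices (of the same order n) disagree."""
    n = A.shape[0]
    return np.count_nonzero(A != B) / n**2

# e.g., the 4-cycle and the 4-clique disagree on 4 of the 16 entries
C4 = np.array([[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]])
K4 = np.ones((4,4), dtype=int) - np.eye(4, dtype=int)
assert relative_distance(C4, K4) == 0.25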
I am most excited about the result stating that the complexity of distance approximation is at most doubly-exponential in the complexity of testing.
(I strongly object to the convention of calling a property testable or estimable when the relevant query complexity depends on the proximity parameter only. One should just use the latter term.)
A distance estimator for a graph property P is an algorithm that, given G and positive alpha and eps, distinguishes between the case that G is (alpha-eps)-close to P and the case that G is alpha-far from P (in edit distance). We say that P is estimable if it has a distance estimator whose query complexity depends only on eps. Every estimable property is also testable, since testing corresponds to estimating with alpha=eps (equiv. alpha-eps=0). A central result in the area of [graph] property testing, the Fischer--Newman theorem, gives an inverse statement: every testable property is in fact estimable. The proof of Fischer and Newman was highly ineffective, since it incurred a tower-type loss when transforming a testing algorithm for P into a distance estimator. This raised the natural problem, studied recently by Fiat--Ron and by Hoppen--Kohayakawa--Lang--Lefmann--Stagni, of whether one can find a transformation with a polynomial loss. We obtain the following results.
See arXiv 2305.05487
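As an aside, to make the quoted definition concrete, the following Python sketch shows what a distance estimator with query complexity independent of n looks like, using bipartiteness as a stand-in property: it samples a random induced subgraph and brute-forces that subgraph's distance to bipartiteness. This is only an illustration of the definition (the sample size k below is an unanalyzed placeholder), not the transformation studied in the paper.

import itertools
import random
import numpy as np

def distance_to_bipartiteness(S):
    """Exact relative distance of a small graph (k-by-k adjacency matrix S)
    to bipartiteness: over all 2-colourings, count the edges with both
    endpoints on the same side (each such edge accounts for two matrix
    entries), and normalize by k^2."""
    k = S.shape[0]
    best = min(
        sum(S[i, j] for i in range(k) for j in range(i + 1, k)
            if colouring[i] == colouring[j])
        for colouring in itertools.product((0, 1), repeat=k)
    )
    return 2 * best / k**2

def estimate_distance(A, alpha, eps, k=12, seed=None):
    """Schematic distance estimator for bipartiteness: query the k^2 entries
    of a random induced subgraph and output 'close' iff its distance to
    bipartiteness is below alpha - eps/2.  The point is only that the query
    complexity, k^2, is independent of the number of vertices n; choosing k
    as a function of eps (and analyzing the error) is beyond this sketch."""
    rng = random.Random(seed)
    n = A.shape[0]
    sample = rng.sample(range(n), k)
    S = A[np.ix_(sample, sample)]          # the only queries made to A
    return distance_to_bipartiteness(S) <= alpha - eps / 2

Testing, in the sense discussed above, then corresponds to invoking the same routine with alpha = eps.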