A New Approach for Testing Properties of Discrete Distributions

by Ilias Diakonikolas and Daniel Kane

Oded's comments

I have already mentioned this work in my report of The Sublinear Algorithms Workshop (JHU, Jan. 2016), but feel it deserves a more conspicuous mention.

This work presents a very appealing framework for deriving testers for properties of distributions. It suggests to "flatten" the given distributions so that their L2-norm is small, while preserving their distance to the property in question, and then to apply a basic tester that has optimal complexity for distributions of small L2-norm. Using this transformation, one can reduce various property testing problems to the corresponding case (of small L2-norm), offering a unified and simple way of establishing many known results.
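To make the flattening transformation concrete, here is a minimal Python sketch (my own illustration, not code from the paper; sample_fn, domain, and k are placeholder names). One draws k reference samples from the distribution and splits each domain element i into a_i+1 buckets, where a_i is the number of occurrences of i among the reference samples; the flattened distribution has expected squared L2-norm at most 1/(k+1), and flattening two distributions with the same splits preserves their L1 distance.

```python
import random
from collections import Counter

def flatten(sample_fn, domain, k):
    """Flattening sketch: split element i into a_i + 1 buckets, where a_i is
    the number of occurrences of i among k reference samples.
    The flattened distribution has expected squared L2-norm at most 1/(k+1);
    flattening two distributions with the SAME splits (as needed, e.g., for
    closeness testing) leaves their L1 distance unchanged."""
    counts = Counter(sample_fn() for _ in range(k))
    splits = {i: counts[i] + 1 for i in domain}

    def flat_sample():
        # Sample x from the original distribution, then place it in a
        # uniformly random one of its buckets.
        x = sample_fn()
        return (x, random.randrange(splits[x]))

    new_domain_size = len(domain) + k  # each reference sample adds one bucket
    return flat_sample, new_domain_size
```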

I used this approach when teaching the subject of testing properties of distributions (see my lecture notes). The approach has also inspired a work of mine that shows that the uniform distribution is complete with respect to testing identity to a fixed distribution.

The original abstract

We study problems in distribution property testing: Given sample access to one or more unknown discrete distributions, we want to determine whether they have some global property or are $\epsilon$-far from having the property in $\ell_1$ distance (equivalently, total variation distance, or ``statistical distance''). In this work, we give a novel general approach for distribution testing. We describe two techniques: our first technique gives sample-optimal testers, while our second technique gives matching sample lower bounds. As a consequence, we resolve the sample complexity of a wide variety of testing problems.

Our upper bounds are obtained via a modular reduction-based approach. Our approach yields optimal testers for numerous problems by using a standard $\ell_2$-identity tester as a black-box. Using this recipe, we obtain simple estimators for a wide range of problems, encompassing most problems previously studied in the TCS literature, namely:
(1) identity testing to a fixed distribution,
(2) closeness testing between two unknown distributions (with equal/unequal sample sizes),
(3) independence testing (in any number of dimensions),
(4) closeness testing for collections of distributions, and
(5) testing histograms.
For all of these problems, our testers are sample-optimal, up to constant factors.
With the exception of (1), ours are the {\em first sample-optimal testers for the corresponding problems.} Moreover, our estimators are significantly simpler to state and analyze compared to previous results.
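As a concrete illustration of the $\ell_2$ black-box mentioned above (my own sketch, not taken from the paper): with Poissonized sample counts $X_i \sim \mathrm{Poi}(m p_i)$ and $Y_i \sim \mathrm{Poi}(m q_i)$, the statistic $Z = \sum_i ((X_i - Y_i)^2 - X_i - Y_i)$ is an unbiased estimator of $m^2 \|p-q\|_2^2$, and thresholding it yields an $\ell_2$ closeness tester; the threshold constant below is illustrative only.

```python
import numpy as np

def l2_closeness_stat(xs, ys, domain_size):
    """Z = sum_i ((X_i - Y_i)^2 - X_i - Y_i), where X_i, Y_i are the sample
    counts of element i. Under Poissonized sampling (Poisson(m) many samples
    from each distribution), E[Z] = m^2 * ||p - q||_2^2."""
    X = np.bincount(xs, minlength=domain_size)
    Y = np.bincount(ys, minlength=domain_size)
    return np.sum((X - Y) ** 2 - X - Y)

def l2_closeness_test(xs, ys, domain_size, m, eps):
    """Accept iff the estimate of ||p - q||_2^2 falls below eps^2 / 2
    (a midpoint threshold chosen for illustration only)."""
    return l2_closeness_stat(xs, ys, domain_size) <= (m * eps) ** 2 / 2
```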

As an important application of our reduction-based technique, we obtain the first {\em nearly instance-optimal} algorithm for testing equivalence between two {\em unknown} distributions. The sample complexity of our algorithm depends on the {\em structure of the unknown distributions} -- as opposed to merely their domain size -- and is much better than that of the worst-case optimal $\ell_1$-tester in most natural instances. Moreover, our technique naturally generalizes to other metrics beyond the $\ell_1$-distance. As an illustration of its flexibility, we use it to obtain the first near-optimal equivalence tester under the Hellinger distance.

Our lower bounds are obtained via a direct information-theoretic approach: Given a candidate hard instance, our proof proceeds by bounding the mutual information between appropriate random variables. While this is a classical method in information theory, prior to our work, it had not been used in distribution property testing. Previous lower bounds relied either on the birthday paradox or on moment-matching, and were thus restricted to symmetric properties. Our lower bound approach does not suffer from any such restrictions and gives tight sample lower bounds for the aforementioned problems.

See ECCC TR16-074.
