Statistical Difference Beyond the Polarizing Regime

by Itay Berman, Akshay Degwekar, Ron D. Rothblum, and Prashant Nalini Vasudevan

Oded's comments

This work amplifies the feeling that statistical zero-knowledge (SZK) happens to be related to computational problems regarding various statistical measures. While the foregoing relation was previously known to hold for statistical distance and entropy difference (see survey), the current paper shows that it also holds for two other statistical measures. Interestingly, in both cases, the gap problems that are shown to be SZK-complete allow an arbitrary noticeable gap between the parameters characterizing the close and far instances, unlike the known result for statistical distance (where the parameter characterizing the close instances has to be smaller than the square of the parameter characterizing the far instances).

The original abstract

The polarization lemma for statistical distance (SD), due to Sahai and Vadhan (JACM, 2003), is an efficient transformation taking as input a pair of circuits $(C_0,C_1)$ and an integer $k$ and outputting a new pair of circuits $(D_0,D_1)$ such that if $SD(C_0,C_1)\geq\alpha$ then $SD(D_0,D_1) \geq 1-2^{-k}$ and if $SD(C_0,C_1) \leq \beta$ then $SD(D_0,D_1) \leq 2^{-k}$. The polarization lemma is known to hold for any constant values $\beta < \alpha^2$, but extending the lemma to the regime in which $\alpha^2 \leq \beta < \alpha$ has remained elusive. The focus of this work is on studying the latter regime of parameters. Our main results are:

  1. Polarization lemmas for different notions of distance, such as Triangular Discrimination (TD) and Jensen-Shannon Divergence (JS), which enable polarization for some problems where the statistical distance satisfies $\alpha^2 < \beta < \alpha$. We also derive a polarization lemma for statistical distance with any inverse-polynomially small gap between $\alpha^2$ and $\beta$ (rather than a constant).
  2. The average-case hardness of the statistical difference problem (i.e., determining whether the statistical distance between two given circuits is at least $\alpha$ or at most $\beta$), for any values of $\beta < \alpha$, implies the existence of one-way functions. Such a result was previously only known for $\beta < \alpha^2$.
  3. A (direct) constant-round interactive proof for estimating the statistical distance between any two distributions (up to any inverse polynomial error) given circuits that generate them.
See ECCC TR19-038.
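For context, here is a reminder (standard facts, not part of the abstract above) of the two operations from which the Sahai and Vadhan transformation is built, applied to a pair of samplable distributions $(X_0,X_1)$ at statistical distance $\delta = SD(X_0,X_1)$:

  * XOR operation: for a parameter $\lambda$, let $D_b$ output $(X_{b_1},\ldots,X_{b_\lambda})$, where $b_1,\ldots,b_\lambda$ are uniform bits satisfying $b_1\oplus\cdots\oplus b_\lambda = b$; then $SD(D_0,D_1) = \delta^\lambda$.
  * Direct product: taking $m$ independent copies yields $SD(X_0^{\otimes m},X_1^{\otimes m}) \leq m\delta$ as well as $SD(X_0^{\otimes m},X_1^{\otimes m}) \geq 1-2\exp(-m\delta^2/2)$.

The XOR operation maps the promise $(\alpha,\beta)$ to $(\alpha^\lambda,\beta^\lambda)$, and taking $m \approx \lambda\cdot\alpha^{-2\lambda}$ direct-product copies then pushes the far case to $1-2^{-\Omega(\lambda)}$ while the close case stays below $m\beta^\lambda = \lambda\cdot(\beta/\alpha^2)^\lambda$, which tends to zero (as $\lambda$ grows) precisely when $\beta < \alpha^2$. This is the source of the $\beta < \alpha^2$ threshold mentioned in the abstract.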

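For reference (again not quoted from the paper, and with one common normalization that the paper may scale differently), the two measures mentioned in the first item are defined, for distributions $P$ and $Q$, by $TD(P,Q) = \frac{1}{2}\sum_x \frac{(P(x)-Q(x))^2}{P(x)+Q(x)}$ and $JS(P,Q) = \frac{1}{2}KL(P\|M) + \frac{1}{2}KL(Q\|M)$, where $M = \frac{1}{2}(P+Q)$ and $KL$ denotes the Kullback-Leibler divergence. With this normalization $SD(P,Q)^2 \leq TD(P,Q) \leq SD(P,Q)$, and, measuring $KL$ in bits, $JS(P,Q) = H(M) - \frac{1}{2}(H(P)+H(Q)) \in [0,1]$.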

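Finally, a purely illustrative Python sketch (this is not the paper's protocol; the distributions C0 and C1 below are arbitrary toy examples standing in for the circuits' output distributions): it computes SD, TD and JS exactly for explicitly given finite distributions and numerically checks the XOR identity $SD(D_0,D_1) = SD(C_0,C_1)^\lambda$ recalled above.

```python
# Illustrative only: exact computation of SD, TD, and JS for explicitly given
# finite distributions (dicts mapping outcomes to probabilities), plus a
# numerical check of the XOR identity SD(D_0, D_1) = SD(C_0, C_1)**lam.
# The distributions C0, C1 below are arbitrary toy examples, not from the paper.

from itertools import product
from math import log2


def sd(p, q):
    """Statistical (total variation) distance."""
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in keys)


def td(p, q):
    """Triangular discrimination, normalized so that sd(p,q)**2 <= td(p,q) <= sd(p,q)."""
    keys = set(p) | set(q)
    return 0.5 * sum(
        (p.get(x, 0.0) - q.get(x, 0.0)) ** 2 / (p.get(x, 0.0) + q.get(x, 0.0))
        for x in keys
        if p.get(x, 0.0) + q.get(x, 0.0) > 0
    )


def js(p, q):
    """Jensen-Shannon divergence in bits (lies in [0, 1])."""
    keys = set(p) | set(q)
    m = {x: 0.5 * (p.get(x, 0.0) + q.get(x, 0.0)) for x in keys}

    def kl(a, b):
        return sum(a[x] * log2(a[x] / b[x]) for x in a if a[x] > 0)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)


def xor_pair_sd(p0, p1, lam):
    """SD of the pair (D_0, D_1), where D_b outputs (X_{b_1}, ..., X_{b_lam})
    for uniformly random bits b_1, ..., b_lam whose XOR equals b."""
    dists = []
    for b in (0, 1):
        d = {}
        for bits in product((0, 1), repeat=lam):
            if sum(bits) % 2 != b:
                continue
            # Each bit string of the right parity is chosen with probability 2^{-(lam-1)}.
            for outcome in product(*[(p0 if bit == 0 else p1) for bit in bits]):
                prob = 0.5 ** (lam - 1)
                for bit, x in zip(bits, outcome):
                    prob *= (p0 if bit == 0 else p1)[x]
                d[outcome] = d.get(outcome, 0.0) + prob
        dists.append(d)
    return sd(dists[0], dists[1])


if __name__ == "__main__":
    C0 = {0: 0.5, 1: 0.3, 2: 0.2}   # arbitrary toy distribution
    C1 = {0: 0.2, 1: 0.3, 2: 0.5}   # arbitrary toy distribution
    delta = sd(C0, C1)
    print(f"SD = {delta:.4f}, TD = {td(C0, C1):.4f}, JS = {js(C0, C1):.4f}")
    for lam in (2, 3, 4):
        print(f"lam = {lam}: XOR construction SD = {xor_pair_sd(C0, C1, lam):.6f},"
              f" delta**lam = {delta ** lam:.6f}")
```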