The introduction of the boosting technique to the setting of differential privacy marks another step towards linking this area to more traditional parts of the theory of computation. I am also very interested in the probabilistic inequality that underlies the improved bounds for natural composition operations.
We consider a trusted curator who maintains a database of sensitive information about a population of participants and wants to release privacy-preserving answers to statistical queries about the population. A successful research program has, in the last few years, formulated the rigorous privacy guarantee of differential privacy [Dwork, McSherry, Nissim, and Smith '06] and provided both feasibility results and lower bounds for differentially private data analysis. We introduce new tools for designing differentially private algorithms:

1. A new boosting methodology that lets us improve the accuracy guarantees of weak differentially private algorithms, yielding stronger, more accurate algorithms.

2. An improved understanding of composition for differentially private analyses: we show that privacy guarantees degrade more slowly under composition than was previously known.

We use these tools to show that, computational complexity aside, differential privacy permits surprisingly rich statistical data analyses.
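To illustrate why improved composition matters, here is a minimal numerical sketch comparing the basic composition bound (privacy loss grows linearly in the number k of analyses) with an "advanced" composition bound of the form known from this line of work, under which the epsilon term grows only on the order of sqrt(k) at the cost of an extra additive failure probability delta'. The abstract does not state the exact bound, so the formula and the parameter values below are assumptions for illustration only:

```python
import math

def basic_composition_eps(eps, k):
    # Basic composition: running k eps-DP analyses yields (k * eps)-DP.
    return k * eps

def advanced_composition_eps(eps, k, delta_prime):
    # Advanced-composition-style bound (assumed form, for illustration):
    # the epsilon term grows ~ sqrt(k) instead of linearly in k,
    # in exchange for an additional delta_prime failure probability.
    return eps * math.sqrt(2 * k * math.log(1 / delta_prime)) \
        + k * eps * (math.exp(eps) - 1)

# Illustrative parameters: 1000 analyses, each 0.01-DP.
eps, k, delta_prime = 0.01, 1000, 1e-6
print(basic_composition_eps(eps, k))                    # 10.0
print(advanced_composition_eps(eps, k, delta_prime))    # roughly 1.76
```

With these parameters the sqrt(k)-style bound is several times smaller than the linear one, which is what lets many more weak analyses be composed within a fixed overall privacy budget.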