THE 2009 OBERWOLFACH MEETING ON COMPLEXITY THEORY
LACONIC and PERSONAL IMPRESSIONS OF ODED GOLDREICH

Following the initiation of short reports in 2007, this time we had 5-minute reports by almost all participants. My own feeling is that these reports were very beneficial.

In addition, I wish to highlight an informal specialized "session" organized by Or Meir with a focus on Probabilistic Proof Systems. The session, which took the form of a panel discussion with almost all attendees serving as informal panelists, was initiated with the question "what now?" The comments made included a few short reports on various works, suggestions of well-known and less-known open problems that deserve our attention, and proposals on how these problems may be addressed (see a more detailed account below). *In my opinion, this "session" is really the type of thing that workshops like the Oberwolfach Meeting should focus on.*

Among the presentations I heard, I learned most from the following:

Scott Aaronson: Efficient Simulation of Quantum Mechanics Collapses PH
Mark Braverman: Polylogarithmic independence fools AC0 circuits
Prasad Raghavendra: Complexity of CSPs: Exact and Approximation
Omer Reingold: On Constructing Pseudorandom Generators from One-Way Functions
Ben Rossman: Random Graphs and AC0 Indistinguishability

I also enjoyed very much hearing the following presentations of work I knew well:

Irit Dinur: 2-query composition of PCPs
Iftach Haitner: Parallel Repetition of Computationally-Sound Protocols
Valentine Kabanets: Direct-Product Decoding and Testing
Salil Vadhan: On "Inaccessible Entropy"

For comments regarding these works (and others) see the "my choices" page. [written: Nov 2009]

ENCL: A brief summary of the "open session" organized by Or Meir.
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Or Meir organized an "open session" on Probabilistic Proof Systems. In his opening comments he asked the participants to express their opinions about the most important open problems regarding IP and PCP. Or himself named the characterization of LTCs and proving lower bounds on the length of PCPs. Oded said that even separating LTCs from error-correcting codes of similar rate and distance would be nice, and Swastik commented that such a separation follows from known bounds on LDPCs, leading Oded to rephrase the challenge as separating LTCs from LDPCs. Russell then reported on an attempt of his to prove lower bounds on the length of PCPs, which so far has only yielded partial results regarding unique games.

At the participants' request, Or gave an overview of his combinatorial construction of super-fast PCPs (i.e., PCPs with a verifier that runs in time polylogarithmic in the length of the original proof). Thilo gave an overview of his construction of super-fast PCPs of almost-linear length (i.e., a polylogarithmic-time verifier and almost-linear proof length); this provides a super-fast version of Irit's celebrated result. Some of the techniques used by Thilo also arise in Or's work.

Dana mentioned the sliding scale conjecture, which asserts that it is possible to construct two-query (or just constant-query) PCPs with arbitrarily small soundness error such that the error probability decreases exponentially with the answer length. Salil suggested that PCP theorems should be stated as a strong type of Levin reduction, one that requires an efficient transformation of oracles that make the verifier accept (with probability above the soundness error) into (error-less) NP-witnesses for membership.
A discussion followed regarding which constructions provide this feature, with the feeling that the only obstacle is the use of the parallel repetition theorem. While the feeling is that in all known constructions an adequate oracle can be constructed (from the original NP-witness) in polynomial time, Or asked whether these transformations can be implemented in almost-linear time. Irit mentioned that her recent joint work with Or uses an embedding of PCPs in de Bruijn graphs that incurs a logarithmic-factor increase in the error probability. She posed the problem of obtaining a similar embedding in "nice" graphs while incurring only a constant-factor increase in the error probability.