Oded Goldreich - Current Teaching
Currently (and for some time now), I am not teaching,
but I am willing to supervise student-led reading groups on
the topics whose syllabi appear below.
These reading groups are intended for small groups
of highly motivated students,
who wish to learn the subject in depth as well as experiment with
reading material by themselves and brainstorming with other students.
No group should contain more than six students,
and the recommended size is 3-5.
The general suggested outline for the reading is provided by me
(see the syllabi below), but the actual implementation will be up to the students.
Record of past teaching (partial)
Syllabi of reading groups
Introduction to complexity theory
Complexity Theory
is concerned with the study of the intrinsic difficulty
of computational tasks. This study tends to aim at generality:
It focuses on natural computational resources, and the effect
of limiting those on the class of problems that can be solved.
The (half-century) history of Complexity Theory has witnessed
two main research efforts (or directions).
The first direction is aimed towards actually establishing concrete
lower bounds on the complexity of problems,
via an analysis of the evolution of the process of computation.
Thus, in a sense, the heart of this direction is a ``low-level''
analysis of computation. Most research in circuit complexity
and in proof complexity falls within this category.
In contrast, a second research effort is aimed at exploring
the connections among computational problems and notions,
without being able to provide absolute statements regarding
the individual problems or notions.
This effort may be viewed as a ``high-level'' study of computation,
and the reading (see guidelines)
will be confined to it. Specific topics will include:
- revisiting the P-vs-NP Question and NP-Completeness;
- separation results for time and space complexity;
- non-uniform complexity (e.g., P/poly);
- the Polynomial-time Hierarchy;
- space complexity (focus on log-space and NL);
- randomized computations (RP and BPP);
- counting problems and their approximation (the class #P and approximate-#P);
- hardness amplification (from worst-case and within average-case);
- pseudorandomness (generators and derandomization);
- probabilistic proof systems (IP, ZK, and PCP);
- approximation (optimization/search and property-testing/decision);
- average-case complexity.
This course has some overlap with the course
foundations of cryptography.
Specifically, one-way functions, pseudorandom generators,
and zero-knowledge proofs are covered in both courses,
but the perspective and the actual contents are different.
Foundations of Cryptography
The foundations of cryptography
are the paradigms, approaches
and techniques used to conceptualize, define and provide solutions
to natural security concerns. The reading will present some of
these conceptual tools as well as some of the fundamental results
obtained using them. The emphasis is on the clarification
of fundamental concepts and on demonstrating the feasibility
of solving several central cryptographic problems.
In the first semester we will focus on basic tools such as
one-way functions,
pseudorandomness and zero-knowledge proofs.
These tools will be used in the second semester, which will provide
a rigorous treatment of three basic applications:
Encryption, Signatures,
and General Cryptographic Protocols.
See guidelines for the reading.
This course has some overlap with the course
introduction to complexity theory.
Specifically, one-way functions, pseudorandom generators,
and zero-knowledge proofs are covered in both courses,
but the perspective and the actual contents are different.
Introduction to property testing
Property Testing
is concerned with approximate decisions, where the
task is distinguishing between objects having a predetermined property
and objects that are ``far'' from having this property.
Typically, objects are modeled by functions,
and distance between functions is measured as the fraction
of the domain on which the functions differ.
We consider (randomized) algorithms that may query the function
at arguments of their choice,
and seek algorithms of very low complexity
(i.e., much lower than the size of the function's domain).
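As a concrete illustration of this query model, here is a minimal sketch of the Blum-Luby-Rubinfeld (BLR) linearity test (the subject of Chap 2 below): it decides whether a Boolean function on n-bit strings is linear over GF(2) by querying it at a few random points. The function names and parameters here are illustrative, not taken from the text.

```python
import random

def blr_linearity_test(f, n, trials=100):
    """BLR test: query f at random x, y, and their bitwise XOR.

    A GF(2)-linear f (one satisfying f(x XOR y) = f(x) XOR f(y)
    for all x, y) passes every trial; a function that is far from
    every linear function is rejected with probability that grows
    with its distance from linearity.  Only 3*trials queries are
    made, independent of the domain size 2**n.
    """
    for _ in range(trials):
        x = random.getrandbits(n)
        y = random.getrandbits(n)
        if f(x) ^ f(y) != f(x ^ y):
            return False  # found a witness of non-linearity
    return True  # no violation observed: f is probably close to linear

# A linear example: parity of the input bits selected by a fixed mask.
def parity(x):
    return bin(x & 0b10110).count("1") % 2
```

For instance, `parity` passes every trial, while the constant-1 function violates the identity on every pair of points and is rejected immediately. The point is that the number of queries depends only on the number of trials, not on the size of the function's domain.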
This reading provides a taste of the area. Specific topics include:
- Introduction, motivation, and mind-frame (Sec 1.1-1.2);
- Main notions and general results (Sec 1.3.1, 1.3.3, and 1.3.5);
- Testing Linearity (Chap 2);
- Low Degree Tests (Sec 3.1 and 3.4) [background Sec 3.2-3.3];
- Testing monotonicity on the Boolean hypercube (Sec 4.2.1)
and on the line (Sec 4.3.1) [and beyond Sec 4.4.0];
- Testing Dictatorship (Sec 5.2.1 and 5.2.3) and Juntas (Sec 5.3);
- Testing by Implicit Sampling (Sec 6.2);
- Lower Bounds Techniques (Sec 7.2 and 7.3);
- Testing Graph Properties in the Dense Graph Model
(Sec 8.2 and 8.3 [and Sec 8.4]);
- Testing Graph Properties in the Bounded-Degree Graph Model
(Sec 9.1, 9.2.3, 9.2.4, 9.3.1, and 9.4.1 [and Sec 9.5]);
- Testing Graph Properties in the General Graph Model
(Sec 10.1, 10.2.2, and 10.3.2.1);
- Testing Properties of Distributions (Sec 11.1 and 11.2).
Material in square brackets is optional.