Oded Goldreich - Academic Profile

My research interests lie within the Theory of Computation. Below is a brief introduction to that discipline, followed by a brief description of my specific areas of research and a brief summary of some of my research and expository contributions.

See also ``press releases'', periodical statements, Curriculum Vitae, (annotated) list of publications, and list of papers available on-line.

Last updated: July 2007.


A Brief Introduction to the Theory of Computation

Computer Science is a cluster of related scientific and engineering disciplines concerned with the study and application of computations. These disciplines range from the pure and basic scientific discipline concerned with the foundations (or theory) of computer science (or of computation) to engineering disciplines concerned with specific applications.

The foundations (or theory) of computer science can be partitioned into two sub-disciplines: one concerned with the Theory of Computation, and the other concerned with the Theory of Programming. The Theory of Computation aims at understanding the nature of computation, and specifically the inherent possibilities and limitations of efficient computations. The Theory of Programming is concerned with the actual task of implementing computations (i.e., writing computer programs).

The Theory of Computation is a scientific discipline concerned with the study of general properties of computation, be it natural, man-made, or imaginary. Most importantly, it aims to understand the nature of efficient computation. A few examples related to my specific research interests follow.

The nature of efficient computation (and computation in general) is indeed the formative question of the Theory of Computation. I consider this question (or rather a cluster of questions) to be one of the most fundamental scientific questions ever asked. Unfortunately, the fundamental status of this question is usually disregarded due to its immediate technological impact (as argued next).

The revolutionary impact of a technology (in our case the Computing Technology) on our society does not necessarily facilitate the appreciation of the intellectual contents of the theory underlying it (in our case the Theory of Computation). Typically, people are so overwhelmed by the wonders of the technology that they fail to wonder about the theory underlying it. Specifically, people tend not to think of computing in general terms but rather in the concrete terms in which they have last encountered it. Consequently, the intellectual contents of the Theory of Computation are rarely communicated and rarely understood (by non-specialists). An attempt to redeem this state of affairs is made in the material available from the web-page Theory of Computing - A Scientific Perspective.


My Specific Areas of Research

The theory of computation can be sub-divided into numerous overlapping areas. Two main clusters of areas are complexity theory and algorithms, where the distinction is whether the focus is on the computational resources (as in complexity theory) or on the tasks to be solved (as in algorithms). Indeed, complexity theory is sub-divided according to the model and resources of computation (i.e., time complexity, space complexity, circuit complexity, communication complexity, proof complexity, etc), whereas algorithms are sub-divided according to the respective tasks (e.g., graph algorithms, linear programming, approximation algorithms, computational number theory, computational geometry, etc). In addition, there are areas of the theory of computation that should not be forced into the above two clusters. Examples include Cryptography, Distributed Computing, and Computational Learning Theory.

My main research areas are

In addition, I am interested in complexity theory at large, and have some research experience in Distributed Computing.

The study of various notions of pseudorandomness

A fresh view of the question of randomness was taken in the theory of computation: It has been postulated that a distribution is pseudorandom if it cannot be told apart from the uniform distribution by any efficient procedure. This paradigm, originally associating efficient procedures with polynomial-time algorithms, has been applied also with respect to a variety of limited classes of such distinguishing procedures.
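As a toy illustration of this computational view (a minimal sketch; the biased source and the threshold test below are hypothetical choices made for illustration, not taken from the text), the following estimates the advantage with which one simple efficient test tells a clearly non-pseudorandom bit source apart from the uniform one:

```python
import random

def uniform_bits(n, rng):
    # The uniform distribution over n-bit strings.
    return [rng.randint(0, 1) for _ in range(n)]

def biased_bits(n, rng, p=0.7):
    # A (clearly non-pseudorandom) source: each bit is 1 with probability p.
    return [1 if rng.random() < p else 0 for _ in range(n)]

def distinguisher(bits):
    # An efficient test: accept iff the fraction of ones exceeds 0.6.
    return sum(bits) / len(bits) > 0.6

def advantage(source, n=1000, trials=200, seed=0):
    # Estimate |Pr[D(source) = 1] - Pr[D(uniform) = 1]|,
    # the distinguishing advantage of D against the given source.
    rng = random.Random(seed)
    p_src = sum(distinguisher(source(n, rng)) for _ in range(trials)) / trials
    p_uni = sum(distinguisher(uniform_bits(n, rng)) for _ in range(trials)) / trials
    return abs(p_src - p_uni)
```

Here the advantage is close to 1, witnessing that the biased source is not pseudorandom; a pseudorandom distribution is precisely one for which every efficient test achieves only negligible advantage.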

The study of various types of probabilistic proof systems

Various types of probabilistic proof systems have played a central role in the development of the theory of computation in the last two decades. These proof systems share a common (untraditional) feature - they carry a probability of error; yet, this probability is explicitly bounded and can be reduced by successive application of the proof system. The gain in allowing this untraditional relaxation is substantial, as demonstrated by three well known results regarding interactive proofs, zero-knowledge proofs, and probabilistically checkable proofs: In each of these cases, allowing a bounded probability of error makes the system much more powerful and useful than its traditional (error-less) counterpart.
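The error-reduction claim can be illustrated in isolation (a schematic sketch; the ``proof system'' below is just an abstract verifier with soundness error 1/2, not any specific protocol): a verifier that wrongly accepts a false statement with probability 1/2 per round errs with probability at most 2^(-k) after k independent rounds, if it accepts only when every round accepts.

```python
import random

def single_round(statement_true, rng):
    # Abstract verifier round: always accepts true statements; wrongly
    # accepts a false statement with (soundness) error probability 1/2.
    return True if statement_true else rng.random() < 0.5

def repeated_verifier(statement_true, k, rng):
    # Accept iff all k independent rounds accept; the soundness
    # error drops from 1/2 to 2**(-k), while true statements
    # are still accepted with probability 1.
    return all(single_round(statement_true, rng) for _ in range(k))

def empirical_error(k, trials=10000, seed=0):
    # Fraction of trials in which a FALSE statement is wrongly accepted.
    rng = random.Random(seed)
    return sum(repeated_verifier(False, k, rng) for _ in range(trials)) / trials
```

For instance, ten repetitions already push the empirical error below one in a thousand, matching the 2^(-10) bound.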

The study of property testing, a notion of approximation for decision problems

Property Testing is the study of super-fast (randomized) algorithms for approximate decision making. These algorithms are given direct access to items of a huge data set, and determine whether this data set has some predetermined (global) property or is ``far'' from having this property. Remarkably, this approximate decision is made by accessing a small portion of the data set. That is, the focus of this study is on algorithms that make an approximate decision without looking at the entire object (and sometimes even running in constant time!).
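A minimal sketch of this phenomenon (using the textbook toy property ``the string is all zeros''; the specific tester and query bound are standard illustrations, not drawn from the text): the tester below queries O(1/eps) positions, independently of the size of the data set, yet distinguishes strings that have the property from strings that are eps-far from it.

```python
import math
import random

def all_zeros_tester(get_bit, n, eps, rng):
    # Property tester for "the n-bit string is all zeros".
    # Queries only ceil(2/eps) positions, independent of n.
    # - If the string is all zeros, it always accepts.
    # - If at least eps*n positions hold 1 (i.e., the string is
    #   eps-far from all-zeros), it rejects with probability
    #   at least 1 - e**(-2) > 2/3, since the chance of missing
    #   every 1 is at most (1 - eps)**(2/eps) <= e**(-2).
    q = math.ceil(2 / eps)
    for _ in range(q):
        if get_bit(rng.randrange(n)) == 1:
            return False
    return True
```

Note that `get_bit` models direct access to individual items of the data set: the tester never reads the whole input, which is exactly the point of the framework.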

The foundations of cryptography

Whereas classical cryptography was confined to the art of designing and breaking encryption schemes (or ``secrecy codes''), Modern Cryptography is concerned with the rigorous analysis of any system that should withstand malicious attempts to abuse it. We emphasize two aspects of the transition from classical to modern cryptography: (1) the widening of scope from one specific task to an utmost general class of tasks; and (2) the move from an engineering-art which thrives on ad-hoc tricks to a scientific discipline based on rigorous approaches and techniques. The foundations of cryptography are the paradigms, approaches and techniques used to conceptualize, define and provide solutions to natural cryptographic problems. Currently, research in this area includes:


Some of my Research Contributions

My most important research contributions are:

Additional research contributions that I'd like to highlight are arranged in the following (somewhat overlapping) categories. A full list of my research contributions is available in various formats. See also the list of my works available on-line.

Contributions to the study of one-way functions and pseudorandomness

Contributions to the study of derandomization and weak sources of randomness

Contributions to the study of various types of probabilistic proof systems

Probabilistically Checkable Proofs (PCP):

Statistical Zero-Knowledge (SZK):

Interactive Proofs (IP):

Knowledge Complexity (KC):

Contributions to the study of (computational) zero-knowledge and related topics

Contributions to other aspects of the foundations of cryptography

Contributions to the study of property testing

Contributions to the study of Distributed Computing

Contributions to other aspects of the theory of computing


Expository Contributions: Books, Lecture Notes and Surveys

I have spent a considerable amount of time trying to communicate the main ideas developed in the research areas in which I have expertise. This led me to write numerous surveys and a few books and lecture notes. My main expository works are briefly described below.

Computational Complexity: A Conceptual Perspective (2008). This book offers a conceptual perspective on complexity theory. It is intended to serve as an introduction to the field, and can be used either as a textbook or for self-study. The book is also useful to experts, since it provides expositions of various sub-areas of complexity theory. The exposition starts from the intuitive questions, as embodied in the concepts studied in each sub-area, and discusses the importance of these questions, the definitional choices made in the actual formulation, and the approaches and ideas that underlie the answers.

Foundations of Cryptography: a two-volume textbook (Vol1, 2001; Vol2, 2004). This book is aimed at presenting firm foundations for cryptography; that is, to present the paradigms, approaches and techniques used to conceptualize, define and provide solutions to natural ``security concerns'', as well as some of the fundamental results obtained using them. The emphasis is on the clarification of fundamental concepts and on demonstrating the feasibility of solving several central cryptographic problems.

Modern Cryptography, Probabilistic Proofs and Pseudorandomness (1998). The interplay between randomness and computation is one of the most fascinating scientific phenomena uncovered in the last couple of decades. This interplay is at the heart of modern cryptography and plays a fundamental role in complexity theory at large. Specifically, the interplay of randomness and computation is pivotal to several intriguing notions of probabilistic proof systems and is the focal point of the computational approach to randomness. This book provides an introduction to these three, somewhat interwoven domains (i.e., cryptography, proofs and randomness).

Randomized Methods in Computation - Lecture Notes (2001). A variety of randomized methods are being employed in the study of computation. The aim of this course is to make the students familiar with some of these methods. We wish to stress three aspects of this course: (1) it focuses on methods (i.e., tools and techniques) rather than on concepts; (2) these methods are probabilistic, and so the results are often not ``(fully) explicit''; and (3) the focus is on applications to the study of computation.


Other activities: Guiding students and Editorial work

A partial list of students that I guided includes Boaz Barak (Ph.D. 04); Ran Canetti (M.Sc. 92, Ph.D. 95); Guy Even (M.Sc. 91); Iftach Haitner (M.Sc. 03); Amir Herzberg (D.Sc. 91); Hugo Krawczyk (D.Sc. 90); Eyal Kushilevitz (M.Sc. 89); Yehuda Lindell (Ph.D. 02); Noam Livne (Ph.D. 10); Or Meir (M.Sc. 07, Ph.D. 11); Erez Petrank (M.Sc. 92, D.Sc. 95); Alon Rosen (Ph.D. 03). For a full list click HERE.

I edited a book in memory of Shimon Even and a collection of surveys and research summaries on property testing. I served (or am still serving) on the editorial boards of SICOMP (1996-2010), Journal of Cryptology (1992-2011), Computational Complexity (since 2003) and ECCC (since its founding in 1994). I served as chair of the steering committee of the Theory of Cryptography Conference (2005-13) and as co-organizer of the Oberwolfach Meeting on Complexity Theory (since 1994).


Back to homepage.