Computational Complexity: A Conceptual Perspective

[drafts of a book by Oded Goldreich]

See copyright notice.

Below are the book's tentative preface and table of contents.

See also related material.

Last updated: May 2007.

Preface (tentative, Jan. 2006)

The quest for efficiency is ancient and universal, as time and other resources are always in shortage. Thus, the question of which tasks can be performed efficiently is central to the human experience.

A key step towards the systematic study of the aforementioned question is a rigorous definition of the notion of a task and of procedures for solving tasks. These definitions were provided by computability theory, which emerged in the 1930's. This theory focuses on computational tasks, and considers automated procedures (i.e., computing devices and algorithms) that may solve such tasks.

In focusing attention on computational tasks and algorithms, computability theory has set the stage for the study of the computational resources (like time) that are required by such algorithms. When this study focuses on the resources that are necessary for any algorithm that solves a particular task (or class of tasks), the study becomes part of the theory of Computational Complexity (also known as Complexity Theory).

Complexity Theory is a central field of the theoretical foundations of Computer Science. It is concerned with the study of the intrinsic complexity of computational tasks. That is, a typical complexity-theoretic study looks at a task (or a class of tasks) and at the computational resources required to solve this task, rather than at a specific algorithm or algorithmic scheme. Actually, research in Complexity Theory tends to start with the computational resources themselves, and addresses the effect of limiting these resources on the class of tasks that can be solved. Thus, Computational Complexity is the study of what can be achieved within limited time (and/or other limited natural computational resources).

The (half-century) history of Complexity Theory has witnessed two main research efforts (or directions). The first direction is aimed towards actually establishing concrete lower bounds on the complexity of problems, via an analysis of the evolution of the process of computation. Thus, in a sense, the heart of this direction is a ``low-level'' analysis of computation. Most research in circuit complexity and in proof complexity falls within this category. In contrast, a second research effort is aimed at exploring the connections among computational problems and notions, without being able to provide absolute statements regarding the individual problems or notions. This effort may be viewed as a ``high-level'' study of computation. The theory of NP-completeness as well as the studies of approximation, probabilistic proof systems, pseudorandomness and cryptography all fall within this category.

The current book focuses on the latter effort (or direction). We list several reasons for our decision to focus on the ``high-level'' direction. The first is the great conceptual significance of the known results; that is, many known results (as well as open problems) in this direction have an appealing conceptual message, which can also be appreciated by non-experts. Furthermore, these conceptual aspects may be explained without entering into excessive technical detail. Consequently, the ``high-level'' direction is more suitable for an exposition in a book of the current nature. Finally, there is a subjective reason: the ``high-level'' direction is within our own expertise, while this cannot be said about the ``low-level'' direction.

The last paragraph brings us to a discussion of the nature of the current book, which is captured by the subtitle (i.e., ``a conceptual perspective''). Our thesis is that complexity theory is extremely rich in conceptual content, but this content is rarely communicated (explicitly) in books and/or extensive surveys of the area. For several years, we advocated the need for texts that focus on the conceptual content of complexity theory, but it seems that the only way of getting them written is to write them. Put in other words, our feeling is that the advocacy for a conceptual perspective of complexity theory will be more effective if backed by a detailed textbook-level exposition of the material. This is indeed the motivation for the current book and its main governing principle.

This book offers a conceptual perspective on complexity theory, and the presentation is designed to highlight this perspective. It is intended mainly for students who wish to learn complexity theory and for educators who intend to teach a course on complexity theory. The book is also intended to promote interest in complexity theory and make it accessible to general readers with adequate background (which mainly means being comfortable with abstract discussions, definitions, and proofs). We expect most readers to have a basic knowledge of algorithms, or at least be fairly comfortable with the notion of an algorithm.

The conceptual perspectives offered in this book are quite straightforward (i.e., they follow directly from known questions and studies in the field). Indeed, the fundamental nature and conceptual content of the questions and studies conducted in complexity theory is evident to anybody who stops for a moment to think about it. The problem, however, is that too few researchers stop to reflect on this content, let alone communicate it to their students. The annoying (and quite amazing) consequence is students who have only a vague understanding of the meaning of these fundamental notions and results. In our opinion, it all boils down to taking the time to explicitly discuss the meaning of definitions and results. A related issue is using the ``right'' definitions (i.e., those that better reflect the fundamental nature of the notion being defined) and teaching things in the (conceptually) ``right'' order.

This book focuses on several sub-areas of complexity theory (see the following organization and chapter summaries). In each case, the exposition starts from the intuitive questions addressed by the sub-area, as embodied in the concepts that it studies. The exposition discusses the fundamental importance of these questions, the choices made in the actual formulation of these questions and notions, the approaches that underlie the answers, and the ideas that are embedded in these answers. Our view is that these (``non-technical'') aspects are the core of the field, and the presentation attempts to reflect this view.

We note that being guided by the conceptual contents of the material leads, in some cases, to technical simplifications. Indeed, for many of the results presented in this book, the presentation of the proof is different (and arguably easier to understand) than the standard presentations.

Table of contents (tentative, Jan. 2006)

  1. Introduction and Preliminaries
  2. P, NP and NP-Completeness
  3. Variations on P and NP
  4. More Resources, More Power?
  5. Space Complexity
  6. Randomness and Counting
  7. The Bright Side of Hardness
  8. Pseudorandom Generators
  9. Probabilistic Proof Systems
  10. Relaxing the Requirements
  Epilogue
    Apdx A - Glossary of Complexity Classes
      A.1 Introduction; A.2 Algorithm-based classes [Time complexity classes, Space complexity]; A.3 Circuit-based classes.
    Apdx B - On the Quest for Lower Bounds
      B.1 Preliminaries; B.2 Boolean Circuit Complexity [Basic Results and Questions, Monotone Circuits, Bounded-Depth Circuits, Formula Size]; B.3 Arithmetic Circuits [Univariate Polynomials, Multivariate Polynomials]; B.4 Proof Complexity [Logical Proof Systems, Algebraic Proof Systems, Geometric Proof Systems].
    Apdx C - On the Foundations of Modern Cryptography
      C.1 Introduction and Preliminaries; C.2 Computational Difficulty; C.3 Pseudorandomness [including Pseudorandom Functions]; C.4 Zero-Knowledge [The Simulation Paradigm, The Actual Definition, A construction and a generic application, Variants and Issues]; C.5 Encryption Schemes [Definitions, Constructions, Beyond Eavesdropping Security]; C.6 Signatures and Message Authentication [Definitions, Constructions, Public-Key Infrastructure]; C.7 General Cryptographic Protocols [The Definitional Approach and Some Models, Some Known Results, Construction Paradigms and Two Simple Protocols, Concluding Remarks].
    Apdx D - Probabilistic Preliminaries and Advanced Topics in Randomization
      D.1 Probabilistic preliminaries [Notational Conventions, Three Inequalities]; D.2 Hashing [Definitions, Constructions, The Leftover Hash Lemma]; D.3 Sampling; D.4 Randomness Extractors [The Main Definition, Extractors as averaging samplers, Extractors as randomness-efficient error-reductions, Other perspectives, Constructions].
    Apdx E - Explicit Constructions
      E.1 Error Correcting Codes [A mildly explicit version of the existential proof, The Hadamard Code, The Reed-Solomon Code, The Reed-Muller Code, Binary codes of constant relative distance and constant rate, Two additional computational problems, A list decoding bound]; E.2 Expander Graphs [Two Mathematical Definitions, Two levels of explicitness, Two properties, The Margulis-Gabber-Galil Expander, The Iterated Zig-Zag Construction].
    Apdx F - Some Omitted Proofs
    Apdx G - Some Computational Problems
      G.1 Graphs; G.2 Formulae; G.3 Algebra; G.4 Number Theory.

Related Material Available on-line

Extracts from the preliminary draft of the book are available as the numbered texts on my webpage of various expositions (1987-2006), which also contains additional expositions.
The current book supersedes previous sets of lecture notes (1999 and 2002).


©Copyright 2006 by Oded Goldreich.
Permission to make copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that new copies bear this notice and the full citation on the first page. Abstracting with credit is permitted.
