Features

Understanding Randomness
By Davin Wilfrid

Is solving a problem more difficult than verifying a solution? Can any efficient process be efficiently reversed? Can you infer a global property of an object by inspecting a tiny portion of it?
At the Radcliffe Institute this year, the six-member research cluster on randomness and computation is searching for answers to these and related abstract questions. Some of the questions have such appealing, everyday parallels that the answers may seem obvious. We all know that solving a crossword puzzle is more difficult than checking a completed one. But when the questions are asked about computation in general, the answers have eluded computer scientists and mathematicians for decades. “It’s like being a physicist in the days of Newton or perhaps even before, when most of the discoveries are yet to be made,” says Eli Ben-Sasson, a member of the cluster. “For almost every question you ask, the answer is unknown.”

The group, which was brought together by Salil Vadhan ’95, the cluster chair and an assistant professor of computer science at Harvard University, includes some of the world’s leading theoretical computer scientists. Besides Vadhan and Ben-Sasson, a former postdoctoral fellow in the Division of Engineering and Applied Sciences at Harvard and a postdoctoral associate in the MIT Laboratory for Computer Science, the group includes Oded Goldreich, Dana Ron BI ’98, Ronitt Rubinfeld, and Madhu Sudan. Irit Dinur, a Miller Fellow in computer science at the University of California at Berkeley, visited the group once in the fall and will return in the spring.

The cluster studies the interplay between randomness and computation, an area of computer science that has become a prime research topic in recent years. The group’s efforts will improve computer security and lead to other practical applications, but cluster members are more concerned with contributing to mankind’s understanding of the power of computation itself. They hope that unlocking the secrets of randomness and computation will lead them one step closer to understanding the Theory of Computation, the governing theory of computer science.
“Some people are driven by the beauty of the problems and their solutions, and you can never tell when the work they do will turn out to be useful,” says Barbara J. Grosz, dean of science at the Radcliffe Institute and the Higgins Professor of Natural Science at Harvard. Because questions of randomness have recently emerged as a focal point of computer science, Grosz says that choosing this particular cluster of renowned researchers was “obviously something we should do.” She is confident that the combination of researchers is a good one, and that the cluster’s interactions will prove successful. “Computer scientists are grappling with many fundamental questions about the role of randomness in computation,” Grosz says, “so bringing together some of the world’s leading people in this area and enabling them to bounce ideas off one another presents an excellent opportunity for Radcliffe to advance science at its frontiers and to contribute to Harvard’s scientific community.”

To understand, by way of analogy, how randomness and computation can interact, imagine that one of the world’s great chefs has given you his coveted recipe for coq au vin, only the recipe is 9,000 pages long and takes seven months to prepare. Now imagine that there was a way to recreate the dish almost perfectly, in a reasonable amount of time, using randomly selected bits of the original recipe condensed into one user-friendly page. You would probably jump at the chance--and your dinner guests would be none the wiser. Similarly, the computer science cluster hopes to use randomness to design efficient algorithms (think: recipes for your computer) for complex computational tasks. Because computers are constrained by time and memory, computer scientists measure the value of a given computation by its efficiency.
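The condensed-recipe idea, drawing a few random samples instead of reading everything, can be sketched in a few lines of Python. This is an illustrative toy, not one of the cluster’s actual algorithms: it estimates what fraction of a large collection has some property by inspecting only a small random sample of it.

```python
import random

def estimate_fraction(data, has_property, num_samples=5000, seed=0):
    """Estimate the fraction of items with a property by inspecting
    only a small random sample of a large collection."""
    rng = random.Random(seed)  # seeded for reproducibility
    hits = sum(has_property(rng.choice(data)) for _ in range(num_samples))
    return hits / num_samples

# A "large object": a million numbers, 30 percent of them even.
population = [2 * i for i in range(300_000)] + [2 * i + 1 for i in range(700_000)]

# The estimate lands near 0.30 after reading only 5,000 of the
# 1,000,000 items.
estimate = estimate_fraction(population, lambda x: x % 2 == 0)
```

With a few thousand samples the estimate is, with overwhelming probability, within a couple of percentage points of the truth, and, crucially, the number of samples needed does not grow with the size of the collection. That independence from the input size is the appeal of the sublinear-time algorithms the cluster studies.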
Cluster members believe they can design “randomized” algorithms that offer reasonably close approximations of solutions to problems too large and complex to solve in their entirety, much like the condensed recipe gives us an almost perfect dish. Algorithms that make effective decisions based on small samples of large objects, which are called sublinear-time algorithms, can add enormous power to the computing process.

Ron, a faculty member in the department of electrical engineering at Tel Aviv University, presented the cluster’s goals in this area in a November seminar titled “On the Difficulty of Being Perfect and the Ease of Being Almost Perfect.” Her own research focuses on such algorithmic issues as using randomized algorithms to infer properties of large, complex objects, and applying them to other areas within computer science, including graph theory, coding theory, pattern recognition, and clustering. Ron is also interested in testing the limits of randomness: clarifying exactly which types of problems can be solved using randomness and which cannot. “There’s something really great in having these people together,” says Ron, whose research overlaps with that of other cluster members in an expanding web of interconnected ideas.

Sudan, a professor of computer science at MIT, echoes Ron’s excitement about the “dream group” at Putnam House. “I’m really thrilled to be with this group,” says Sudan, who has known other cluster members for as long as ten years but sees the Radcliffe Institute as a unique opportunity for collaboration. Sudan’s primary focus is on using randomness to facilitate faster error-correction procedures in computers. As anyone who has ever run a full error check on a hard drive knows, the process can be frustratingly long, sometimes lasting for hours as the computer combs through every bit of information, looking for and correcting each error.
Sudan hopes that by “randomizing” the process--instructing a computer to check only a random sample of the drive instead of trolling through the entire volume--a user could get a reasonable estimate of a system’s integrity without sacrificing long hours to the process. The relationship between sublinear-time algorithms and error-correction codes may seem vague, but the fact that the two are closely connected at a fundamental, theoretical level confirms the importance of the cluster’s research within the field.

“Randomness has permeated the entire field of computer science,” says Vadhan. Seemingly unrelated theoretical questions have merged under close inspection, revealing that many questions are, at their core, exactly the same. Such discoveries are encouraging to the group because they offer further proof that randomness is fundamental to the Theory of Computation.

Vadhan’s focus is pseudorandomness, which he defines as the ability of computers to generate information that “looks” random to another computer but is actually constructed with little or no randomness. The development of pseudorandom generators would be a boon to the computer security industry, giving security specialists a way to mask transmitted information--such as your credit card number--in the guise of totally random numbers. Vadhan hopes that his research, besides shielding sensitive information, will lead to a greater understanding of whether randomized computations can be more efficient than straightforward, deterministic ones.

Goldreich, a professor of computer science and holder of the Meyer W. Weisgal Professorial Chair at the Weizmann Institute of Science in Israel, is similarly intrigued by pseudorandomness and its applications to cryptography, but is quick to emphasize that his primary interest lies in pondering the fundamental questions underlying such applications.
“A good scientist has to be philosophically inclined,” he says, since much of the success of theoretical study comes from “finding the right way to ask a particular question.” Apart from pseudorandomness, Goldreich studies various notions of probabilistically checkable proofs--those that are too long and complex to be validated in full but can be encoded so that a computer, by checking just a few randomly chosen spots, can make a reasonable estimate of the proof’s soundness.

Joining the cluster for the spring semester is Rubinfeld, who was formerly a senior research scientist at NEC Laboratories in New Jersey and will join the MIT faculty following her Radcliffe fellowship. Her main interests are in sublinear-time algorithms, property testing, proof systems, and coding theory.

Rubinfeld will join a group that has already made an enormous contribution to the Radcliffe community, says Barbara Grosz. “I’m absolutely thrilled with the interactions of people within the cluster with Radcliffe at large,” she says, emphasizing the group’s ability to connect with researchers in other disciplines. The group’s November presentation was well attended by Radcliffe fellows representing many disciplines, as was a lecture by Avi Wigderson of the Institute for Advanced Study in Princeton, who in early October began the 2003–2004 Radcliffe Lectures in the Sciences with a talk on randomness. Shafrira Goldwasser, a professor at MIT, will close the series in April.

Are the women and men batting ideas around Putnam House the Isaac Newtons of computer science? Only future generations of computer scientists will be able to answer that question in full, but the cluster’s contributions to the Radcliffe Institute have been both immediate and impressive. “The science is terrific, they’re getting work done, and their seminars and research are involving students and other faculty from Harvard and MIT,” Grosz says. “At other levels, they’re interacting with Radcliffe fellows across many disciplines.
What more could you ask from a cluster?”
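For readers curious what a randomized spot-check looks like concretely, here is a minimal Python sketch (with illustrative names) of the Blum-Luby-Rubinfeld linearity test, an early example of the property testing pursued by Ron and Rubinfeld and an ingredient of probabilistically checkable proofs. It decides whether a function on n-bit strings behaves linearly by probing it at a handful of random points rather than all 2^n of them.

```python
import random

def blr_linearity_test(f, n_bits, trials=200, seed=0):
    """Blum-Luby-Rubinfeld test: a function f on n-bit strings is linear
    over GF(2) exactly when f(x) XOR f(y) == f(x XOR y) for all x, y.
    Probing random pairs catches functions far from linear with high
    probability while reading only a handful of values of f."""
    rng = random.Random(seed)  # seeded for reproducibility
    for _ in range(trials):
        x = rng.randrange(1 << n_bits)
        y = rng.randrange(1 << n_bits)
        if f(x) ^ f(y) != f(x ^ y):
            return False  # found a concrete violation: f is not linear
    return True  # no violation found: f is (probably close to) linear

# Linear: the parity of the bits selected by a fixed mask.
parity = lambda x: bin(x & 0b1011).count("1") % 2
# Non-linear: the AND of the two low bits.
both = lambda x: (x & 1) & ((x >> 1) & 1)

blr_linearity_test(parity, n_bits=10)  # always passes: parity is linear
blr_linearity_test(both, n_bits=10)    # rejected almost surely
```

The striking fact, and the one that makes such tests useful inside probabilistically checkable proofs, is that a constant number of probes suffices no matter how large the function’s domain grows.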