Computer science has many layers of complexity that have yet to be discovered, much less mathematically proved. Understanding what is easy and what is hard to compute can help us answer questions about the power of computing itself: we know a lot about what computers can currently do, but have little idea of what they can’t do. For example, certain problems that are difficult for computers to solve today may turn out to be easy to solve tomorrow, if computer scientists could only find the right program; alternatively, those problems could be fundamentally difficult. Until we have a better understanding of the reasoning behind computation, we simply won’t know for sure.
Ryan Williams, a professor in MIT’s Department of Electrical Engineering and Computer Science (EECS) and a member of CSAIL, wants to uncover and develop new forms of reasoning to begin truly understanding computation. He is interested in the connections between algorithm design and complexity theory, and works with other researchers and students in the Algorithms and Complexity Theory groups to find new algorithms and new limitations in computing. Professor Williams previously taught at Stanford and earned his PhD at Carnegie Mellon. Throughout his career, and through his research at MIT, he has taken key steps toward solving some of the biggest problems in theoretical computer science, including the famous P vs. NP problem, a question about the limits of computation that we still don’t know much about.
Prof. Williams’s overall research vision is to understand the true power of algorithms by uncovering their limitations. These limitations are called complexity lower bounds. Questions about lower bounds are fundamental to the way we think about computation. With P vs. NP, for example, NP is the class of problems whose solutions can be quickly recognized, while P is the class of problems whose solutions can be quickly generated. Intuition says that P ≠ NP (that quickly recognizing solutions does not imply being able to quickly generate them), because P = NP would mean computers are far more powerful than we currently believe. But so far, by all measures, mathematics is extraordinarily far from proving P ≠ NP.
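The gap between recognizing and generating solutions is easy to see concretely. The sketch below (an illustration chosen for this article, not an example drawn from Prof. Williams’s work) uses the subset sum problem, which is in NP: checking a claimed solution takes a single pass over the input, while the obvious way to find one tries exponentially many subsets, and no fundamentally faster general method is known.

```python
from itertools import combinations

def verify(nums, target, indices):
    # Recognizing a claimed solution is fast: sum the chosen
    # elements and compare -- time linear in the input.
    return sum(nums[i] for i in indices) == target

def solve(nums, target):
    # Generating a solution the obvious way tries all 2**n
    # subsets of indices, an exponential search.
    for r in range(len(nums) + 1):
        for subset in combinations(range(len(nums)), r):
            if sum(nums[i] for i in subset) == target:
                return list(subset)
    return None
```

If P = NP, every problem whose `verify` runs in polynomial time would also have a polynomial-time `solve`.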
Lower bounds mathematically prove limitations on the power of algorithms, but we don’t know much about them compared to our knowledge of efficient algorithms. Prof. Williams considers the problem of proving lower bounds to be among the greatest scientific mysteries of our time. There is so little known about what is impossible, and by proving that no clever algorithm can be used to solve certain problems, we can obtain a deeper understanding of computing and a sturdier foundation for an information society.
Among his research groups’ current projects contributing to this understanding is work on fine-grained complexity, the minimum circuit size problem, and hardness magnification. Fine-grained complexity studies problems that can, in theory, be solved efficiently, but whose algorithms still take too long in practice because the inputs involved are enormous.
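A standard example from fine-grained complexity (offered here as an illustration, not as a specific result of the group) is the Orthogonal Vectors problem: given collections of Boolean vectors, decide whether some pair has dot product zero. The naive algorithm below runs in roughly quadratic time, which is fine in theory but painful when the collections hold millions of vectors; fine-grained complexity asks whether the quadratic factor can be substantially beaten, and under the Strong Exponential Time Hypothesis it is conjectured that it cannot.

```python
from itertools import product

def has_orthogonal_pair(A, B):
    # Naive algorithm: test every pair of vectors, taking
    # O(n^2 * d) time for n vectors of dimension d.
    return any(all(x & y == 0 for x, y in zip(a, b))
               for a, b in product(A, B))
```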
The minimum circuit size problem (MCSP) has been studied for decades, but many questions about its complexity are still open. MCSP is a “meta-computational” problem in that it is a computational problem about computation itself: Imagine a computer is given a table of inputs and outputs to a function, and the computer’s goal is to determine if there is an efficient program (a circuit) that implements the function. Is there a computer that can always achieve this goal efficiently, no matter what function is given? That is the mystery of MCSP, and it is a major open problem.
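To make MCSP concrete, here is a brute-force sketch (an illustration only; nothing this naive scales beyond tiny functions). It takes a truth table, encoded as a bitmask over the 2^n input assignments, and searches over all circuits of AND, OR, and NOT gates for the smallest one computing it. The exponential blow-up of this search is exactly the obstacle: whether some algorithm can avoid it is the open question.

```python
def min_circuit_size(target, n, max_size):
    # Brute-force MCSP: smallest number of AND/OR/NOT gates in a
    # circuit computing `target`, a truth table over n variables,
    # or None if more than `max_size` gates are needed.
    mask = (1 << (1 << n)) - 1
    # Truth tables of the input variables x_0 .. x_{n-1}.
    inputs = []
    for i in range(n):
        tt = 0
        for a in range(1 << n):
            if (a >> i) & 1:
                tt |= 1 << a
        inputs.append(tt)

    def search(wires, gates_left):
        if gates_left == 0:
            return False
        seen = set()
        for u in range(len(wires)):
            for v in range(len(wires)):
                for g in (wires[u] & wires[v],   # AND gate
                          wires[u] | wires[v],   # OR gate
                          ~wires[u] & mask):     # NOT gate (ignores v)
                    if g == target:
                        return True
                    if g not in seen and g not in wires:
                        seen.add(g)
                        if search(wires + [g], gates_left - 1):
                            return True
        return False

    if target in inputs:
        return 0
    for s in range(1, max_size + 1):
        if search(inputs, s):
            return s
    return None
```

Even for two variables the search branches over every way to wire each new gate; for the truth tables that real MCSP instances describe, with many variables, this space is astronomically large.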
Proving limitations (lower bounds) on computers, and thereby gaining a full understanding of computing, could eventually tell us precisely how much time, space, and other resources are required to solve a given problem in the real world. In the future, this means we could go forward confident that the program we are running is optimal, the best possible algorithm at that point in time. Such deep knowledge would have a major long-term impact on how we think about and use computing, with significant applications to cryptography. (For example, “proofs of work” in blockchain technologies require a problem that needs at least a certain amount of resources to be solved.)
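The proof-of-work idea can be sketched in a few lines. The toy hashcash-style scheme below (an illustration, not any particular blockchain’s protocol) asks for a nonce making a hash start with k zero bits: finding one is expected to take about 2^k hash evaluations, while checking a claimed nonce takes a single hash. Its security rests on exactly the kind of unproven lower-bound assumption discussed above, namely that no algorithm finds a valid nonce much faster than brute force.

```python
import hashlib

def proof_of_work(data, difficulty_bits):
    # Search for a nonce such that SHA-256(data || nonce) begins
    # with `difficulty_bits` zero bits: expected ~2**difficulty_bits
    # hash evaluations of work.
    bound = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < bound:
            return nonce
        nonce += 1

def check(data, nonce, difficulty_bits):
    # Verifying a claimed nonce costs just one hash.
    digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))
```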