Algorithmic Discrepancy Theory and Randomized Controlled Trials

Speaker

Daniel Spielman
Yale University

Host

Daniela Rus
MIT SCC & CSAIL

Abstract:
Theorems in discrepancy theory tell us that it is possible to partition a set of vectors into two sets that look surprisingly similar to each other. In particular, these sets can be much more similar to each other than those produced by a random partition. For many measures of similarity, computer scientists have developed algorithms that produce these partitions efficiently.
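As a rough illustration of this gap (a minimal numerical sketch, not one of the algorithms discussed in the talk), the following Python snippet compares the discrepancy of a uniformly random signing of a set of vectors with that of a simple greedy signing that keeps the running sum small; the random vector data and the greedy rule are assumptions made only for this example.

# Minimal sketch (illustration only, not the algorithms from the talk):
# compare the worst-coordinate imbalance of a random +/-1 partition of
# vectors with that of a simple greedy signing.
import numpy as np

rng = np.random.default_rng(0)
n, d = 2000, 10
X = rng.normal(size=(n, d))          # n hypothetical vectors in R^d

# Random partition: independent +/-1 signs.
s_rand = rng.choice([-1.0, 1.0], size=n)
disc_rand = np.abs(X.T @ s_rand).max()

# Greedy signing: pick each sign to minimize the norm of the running sum.
running = np.zeros(d)
s_greedy = np.empty(n)
for i, x in enumerate(X):
    s_greedy[i] = 1.0 if np.linalg.norm(running + x) <= np.linalg.norm(running - x) else -1.0
    running += s_greedy[i] * x
disc_greedy = np.abs(X.T @ s_greedy).max()

print(f"random signing discrepancy: {disc_rand:.2f}")
print(f"greedy signing discrepancy: {disc_greedy:.2f}")

On data like this, even the naive greedy signing typically leaves a far smaller worst-coordinate imbalance than a random signing, which is the kind of gap the discrepancy theorems make precise.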

A natural application of these algorithms is the design of randomized controlled trials (RCTs). RCTs are used to test the effectiveness of interventions, such as medical treatments and educational innovations. Randomization is used to ensure that the test and control groups are likely to be similar. When we know nothing about the experimental subjects, a random partition into test and control groups is the best choice. But when experimenters have measured quantities that they expect could influence a subject's response to treatment, they try to ensure that these quantities are evenly distributed between the test and control groups. That is, they want a random partition of low discrepancy.

In this talk, I will survey some fundamental results in discrepancy theory, present a model for the analysis of RCTs, and summarize results from my joint work with Chris Harshaw, Fredrik Sävje, and Peng Zhang that uses algorithmic discrepancy theory to improve the design of randomized controlled trials.

Bio:
Daniel Alan Spielman is the Sterling Professor of Computer Science, Professor of Statistics and Data Science, and Professor of Mathematics at Yale. He received his B.A. in Mathematics and Computer Science from Yale in 1992, and his Ph.D. in Applied Mathematics from M.I.T. in 1995. After spending a year as an NSF Mathematical Sciences Postdoctoral Fellow in the Computer Science Department at U.C. Berkeley, he became a professor in the Applied Mathematics Department at M.I.T. He moved to Yale in 2005.

He has received many awards, including the 1995 ACM Doctoral Dissertation Award, the 2002 IEEE Information Theory Paper Award, the 2008 and 2015 Gödel Prizes, the 2009 Fulkerson Prize, the 2010 Nevanlinna Prize, the 2014 Pólya Prize, the 2021 NAS Held Prize, the 2023 Breakthrough Prize in Mathematics, a Simons Investigator Award, and a MacArthur Fellowship. He is a Fellow of the Association for Computing Machinery and a member of the National Academy of Sciences, the American Academy of Arts and Sciences, and the Connecticut Academy of Science and Engineering.
His main research interests include the design and analysis of algorithms, network science, machine learning, digital communications, and scientific computing.