To make Bayesian inference scalable, we develop algorithms that create compact “summaries” of large datasets. We can then run standard inference algorithms quickly on these summaries, without ever needing to look at the full dataset.
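As a toy illustration of this idea (our own sketch, not the group's algorithm), the snippet below stands in a small weighted subsample for the full dataset and runs a standard conjugate posterior update on it. The model (Gaussian mean with known variance), the uniform-subsampling "summary," and the function name `gaussian_mean_posterior` are all illustrative assumptions; real summary constructions choose points and weights far more carefully.

```python
# Minimal sketch: summarize a large dataset with a small weighted subsample,
# then run a standard conjugate update on the summary instead of the full data.
# (Illustrative only -- uniform subsampling is the crudest possible "summary".)
import numpy as np

rng = np.random.default_rng(0)

# Large dataset: N observations from a Gaussian with unknown mean.
N = 100_000
x = rng.normal(loc=2.0, scale=1.0, size=N)

# Build the summary: m weighted points standing in for all N.
m = 200
idx = rng.choice(N, size=m, replace=False)
summary_x = x[idx]
summary_w = np.full(m, N / m)  # each point "counts for" N/m observations

def gaussian_mean_posterior(xs, ws, prior_mu=0.0, prior_var=10.0, noise_var=1.0):
    """Conjugate Gaussian posterior over the mean, given weighted observations."""
    precision = 1.0 / prior_var + ws.sum() / noise_var
    mean = (prior_mu / prior_var + (ws * xs).sum() / noise_var) / precision
    return mean, 1.0 / precision

# Inference on the summary costs O(m) rather than O(N).
mu_full, var_full = gaussian_mean_posterior(x, np.ones(N))
mu_sum, var_sum = gaussian_mean_posterior(summary_x, summary_w)
print(f"full-data posterior: mean={mu_full:.4f}, var={var_full:.2e}")
print(f"summary posterior:   mean={mu_sum:.4f}, var={var_sum:.2e}")
```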
Our goal is to enable scalable and accurate Bayesian inference for rich probabilistic models by applying optimization techniques. We also aim to understand the connections between the two main approaches to statistical inference: Bayesian and frequentist.
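One well-known way optimization can replace integration in Bayesian inference is the Laplace approximation, sketched below for concreteness (our own example, not necessarily the group's method): find the posterior mode by numerical optimization, then fit a Gaussian there using the curvature of the negative log posterior. The model and all names here are illustrative assumptions.

```python
# Minimal sketch of optimization-driven Bayesian inference via a Laplace
# approximation: optimize for the MAP point, then use local curvature
# to build a Gaussian approximation to the posterior.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=1.0, size=500)

NOISE_VAR = 1.0   # known likelihood variance
PRIOR_VAR = 10.0  # Gaussian prior variance on the mean

def neg_log_posterior(mu):
    # Gaussian likelihood with known variance + Gaussian prior on the mean.
    return 0.5 * np.sum((x - mu) ** 2) / NOISE_VAR + 0.5 * mu**2 / PRIOR_VAR

# Optimization step: locate the posterior mode (MAP estimate).
res = minimize(lambda m: neg_log_posterior(m[0]), x0=np.array([0.0]))
mu_map = res.x[0]

# Curvature at the mode gives the approximation's precision. For this model
# the second derivative is constant: N/NOISE_VAR + 1/PRIOR_VAR.
hessian = len(x) / NOISE_VAR + 1.0 / PRIOR_VAR
print(f"Laplace approximation: N(mean={mu_map:.4f}, var={1.0 / hessian:.2e})")
```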
This CoR takes a unified approach to cover the full range of research areas required for success in artificial intelligence, including hardware, foundations, software systems, and applications.