Models from Bayesian nonparametrics allow us to capture data with unknown latent structure by positing an infinity of latent traits in the prior, and letting the data tell us how many have actually been observed. In practice, however, we cannot tractably learn or even simulate a model with infinitely many latent parameters. Modern posterior inference techniques, such as variational Bayesian inference, typically overcome this by using a finite approximation to the infinite model. But multiple finite approximations often exist for a given model, and it is not known how their errors compare. Furthermore, it is not yet known how to select the size of the approximation in a principled manner. In this project, we aim to develop, characterize, and compare finite approximations to (normalized) completely random measures, a class of priors in Bayesian nonparametrics that generalizes the popular Dirichlet, beta, and gamma processes. This work will provide researchers with a standardized methodology for generating and analyzing finite approximations in Bayesian nonparametrics, and will be useful in the development of automated probabilistic programming inference techniques.
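To make the idea of a finite approximation concrete, here is a minimal sketch (not the project's specific methodology) of one well-known example: a truncated stick-breaking construction of the Dirichlet process, which keeps only the first K atoms of the infinite measure. The function name and parameters below are illustrative choices, not part of this project.

```python
import numpy as np

def truncated_stick_breaking(alpha, K, rng):
    """Finite stick-breaking approximation to a Dirichlet process.

    alpha: concentration parameter of the Dirichlet process.
    K: truncation level (number of atoms kept from the infinite sequence).
    Returns the K atom weights and the probability mass left beyond the
    truncation, which shrinks geometrically in K in expectation.
    """
    # Draw the stick-breaking proportions V_k ~ Beta(1, alpha).
    betas = rng.beta(1.0, alpha, size=K)
    # Weight k is V_k times the stick remaining after the first k-1 breaks.
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights = betas * remaining
    # Mass not captured by the first K atoms (the truncation error).
    leftover = 1.0 - weights.sum()
    return weights, leftover

rng = np.random.default_rng(0)
weights, leftover = truncated_stick_breaking(alpha=2.0, K=50, rng=rng)
```

Choosing the truncation level K in a principled way, and quantifying how the resulting error compares across different finite constructions, is exactly the kind of question this project studies for the broader class of completely random measures.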
If you would like to contact us about our work, please scroll down to the People section and visit one of the group leads' pages, where you can reach out to them directly.