
Colloquium, Richard Guo, MIT

Date: Tuesday, January 31, 2017

Title: Boosting Variational Inference

Abstract: Modern Bayesian inference typically requires some form of posterior approximation, and mean-field variational inference (MFVI) is an increasingly popular choice due to its speed. But MFVI can be inaccurate in important ways: it cannot capture multimodality in the posterior, and it tends to underestimate the posterior covariance. These issues arise because MFVI restricts the approximation to a family of factorized distributions. We instead consider a much more flexible approximating family consisting of all possible finite mixtures of a parametric base distribution (e.g., Gaussian). To efficiently find a high-quality posterior approximation within this family, we borrow ideas from gradient boosting and propose boosting variational inference (BVI). BVI iteratively improves the current approximation by mixing it with a new component from the base distribution family. We develop practical algorithms for BVI and demonstrate their performance on both real and simulated data. Joint work with Xiangyu Wang, Kai Fan, Tamara Broderick, and David Dunson.
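The sketch below illustrates the boosting idea from the abstract: start from a single base-family component and greedily mix in one new component per round. It is not the authors' algorithm; the component-fitting step here is a crude random search rather than a gradient-based boosting update, and the target density, step-size schedule, and helper names (log_target, boost_step, etc.) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized bimodal target: equal mixture of N(-2, 1) and N(2, 1).
    # A single Gaussian (the MFVI-style answer) cannot capture both modes.
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

def mixture_logpdf(x, comps, weights):
    # Log density of a Gaussian mixture; comps is a list of (mean, std).
    parts = [np.log(w) - 0.5 * ((x - m) / s) ** 2 - np.log(s * np.sqrt(2.0 * np.pi))
             for (m, s), w in zip(comps, weights)]
    return np.logaddexp.reduce(np.stack(parts), axis=0)

def elbo(comps, weights, n=1000):
    # Monte Carlo estimate of E_q[log p(x) - log q(x)], sampling from q.
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    idx = rng.choice(len(comps), size=n, p=weights)
    m = np.array([comps[i][0] for i in idx])
    s = np.array([comps[i][1] for i in idx])
    x = m + s * rng.standard_normal(n)
    return float(np.mean(log_target(x) - mixture_logpdf(x, comps, weights)))

def boost_step(comps, weights, step, n_candidates=300):
    # Greedy boosting step: search for a new Gaussian component and mix it
    # into the current approximation with weight `step` (random search here,
    # standing in for the gradient-based component fit from the talk).
    best, best_elbo = None, -np.inf
    for _ in range(n_candidates):
        cand = (rng.uniform(-6.0, 6.0), rng.uniform(0.3, 2.0))
        new_w = [w * (1.0 - step) for w in weights] + [step]
        e = elbo(comps + [cand], new_w, n=400)
        if e > best_elbo:
            best, best_elbo = cand, e
    return comps + [best], [w * (1.0 - step) for w in weights] + [step]

# Boosting loop: one new component per round, with a shrinking mixing weight
# (a Frank-Wolfe-style 2/(t+2) schedule, chosen here for illustration).
comps, weights = [(0.0, 1.0)], [1.0]
for t in range(1, 6):
    comps, weights = boost_step(comps, weights, step=2.0 / (t + 2.0))
    print(f"round {t}: ELBO ~ {elbo(comps, weights):.3f}")
```

On this toy target the mixture places components near both modes within a few rounds, which is exactly the failure mode of a single factorized approximation that the abstract describes.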

Location: Rockefeller Hall 312

Time: 12:30