
Seminar Series: Debdeep Pati

Debdeep Pati
March 11, 2021
3:00 PM - 4:00 PM
Virtual Meeting

Title

Variational inference: recent theoretical developments

Speaker

Debdeep Pati - Texas A&M University, Department of Statistics

Abstract

In the first part of the talk, I shall discuss statistical properties of variational inference, an approximate method for posterior computation in Bayesian models. A novel class of variational inequalities is developed that links the Bayes risk under the variational approximation to the objective function of the variational optimization problem, implying that maximizing the evidence lower bound in variational inference has the effect of minimizing the Bayes risk within the variational density family. Operating in a frequentist setup, the variational inequalities imply that point estimates constructed from the procedure converge at an optimal rate to the true parameter in a wide range of problems. We illustrate the general theory with a number of examples, including mean-field inference in high-dimensional linear regression, latent variable models such as Gaussian mixtures and latent Dirichlet allocation, and minorization-based variational inference in non-conjugate models.

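As background for the discussion above, the evidence lower bound (ELBO) satisfies a standard identity (textbook material, not a result from the talk):

\log p(x) \;=\; \mathrm{ELBO}(q) + \mathrm{KL}\big(q(\theta) \,\|\, p(\theta \mid x)\big),
\qquad
\mathrm{ELBO}(q) := \mathbb{E}_{q}\big[\log p(x, \theta) - \log q(\theta)\big].

Since \log p(x) does not depend on q, maximizing the ELBO over a variational family is equivalent to minimizing the KL divergence from q to the exact posterior within that family; the variational inequalities above connect this same objective, in turn, to the Bayes risk.
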
In the second part, I shall discuss computational guarantees of variational inference, albeit in specialised settings such as singular models and regularized mean-field variational inference. I'll conclude the talk with some open problems.
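
To make the mean-field examples concrete, here is a minimal illustrative sketch of coordinate-ascent variational inference (CAVI) for a toy Bayesian linear regression with known noise variance; the model, function name, and data below are assumptions for illustration, not material from the talk.

import numpy as np

def cavi_linear_regression(X, y, sigma2=1.0, tau2=1.0, n_iters=50):
    # Mean-field VI for y ~ N(X theta, sigma2 * I) with prior theta_j ~ N(0, tau2),
    # using the fully factorized approximation q(theta) = prod_j N(m_j, s2_j).
    p = X.shape[1]
    col_norms = (X ** 2).sum(axis=0)               # ||x_j||^2 for each column
    s2 = 1.0 / (col_norms / sigma2 + 1.0 / tau2)   # coordinate variances (closed form)
    m = np.zeros(p)
    for _ in range(n_iters):
        for j in range(p):
            # Residual with coordinate j held out, using current means of the others.
            r = y - X @ m + X[:, j] * m[j]
            # CAVI update: m_j maximizes the ELBO with all other factors fixed.
            m[j] = s2[j] * (X[:, j] @ r) / sigma2
    return m, s2

# Toy usage: the variational means should track the exact posterior mean.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([2.0, -1.0, 0.0, 0.5, 3.0]) + rng.normal(size=200)
m, s2 = cavi_linear_regression(X, y)
print(np.round(m, 2))

Because this toy model is conjugate, each coordinate update is available in closed form; the mean-field family restricts q to a diagonal covariance, which is exactly the kind of restriction whose statistical and computational consequences the abstract above addresses.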