Seminar Series: Annie Sauer

November 17, 2022
3:00 PM - 4:00 PM
EA170


Speaker: Annie Sauer, PhD Candidate in the Department of Statistics, Virginia Polytechnic Institute and State University (Virginia Tech)

Title: Deep Gaussian Process Surrogates for Computer Experiments


Abstract:

Deep Gaussian processes (DGPs) upgrade ordinary GPs through functional composition, in which intermediate GP layers warp the original inputs, providing flexibility to model non-stationary dynamics. Recent applications in machine learning favor approximate, optimization-based inference, but applications to computer surrogate modeling demand broader uncertainty quantification (UQ). We prioritize UQ for DGPs through full posterior integration in a Bayesian scheme, hinging on elliptical slice sampling of latent layers. We demonstrate how our DGP's non-stationary flexibility, combined with appropriate UQ, allows for active learning, a virtuous cycle of data acquisition and model updating, that departs from traditional space-filling design and yields more accurate surrogates for fixed simulation effort. But not all simulation campaigns can be developed sequentially, and many existing computer experiments are simply too big for full DGP posterior integration due to cubic scaling bottlenecks. For this case we introduce the Vecchia approximation, popular for ordinary GPs in spatial data settings. We show that Vecchia-induced sparsity of Cholesky factors allows for linear computational scaling without compromising DGP accuracy or UQ. We showcase implementation in the "deepgp" package for R on CRAN.


Note: Seminars are free and open to the public. Reception to follow.