
Student Awards

April 18, 2023
3:00 pm - 4:00 pm
209 W Eighteenth Ave (EA) Room 170
Speaker: Eunseop Kim

Title: Regularized exponentially tilted empirical likelihood for Bayesian inference

Abstract: Bayesian inference with empirical likelihood faces a challenge: owing to the convex hull constraint with finite data, the posterior domain is a proper subset of the original parameter space. To address this issue, we propose regularized exponentially tilted empirical likelihood. Our method removes the convex hull constraint using a novel regularization technique, incorporating a continuous exponential family distribution to satisfy a Kullback-Leibler divergence criterion. The regularization arises as a limiting procedure in which pseudo-data are added to the formulation of exponentially tilted empirical likelihood in a disciplined way. We show that regularized exponentially tilted empirical likelihood retains the desirable asymptotic properties of (exponentially tilted) empirical likelihood. Simulation and data analysis demonstrate that the proposed method provides a suitable pseudo-likelihood for Bayesian inference.
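The convex hull issue the abstract refers to can be seen in a minimal sketch of plain (unregularized) exponentially tilted empirical likelihood for a scalar mean. This is illustrative code only, not the speaker's implementation; the function name and the Newton solver are assumptions for the example:

```python
import numpy as np

def etel_log_likelihood(theta, x, iters=50):
    """Log exponentially tilted empirical likelihood for a mean parameter.

    Estimating function: g_i = x_i - theta. The tilting parameter lambda
    minimizes the dual objective (1/n) sum_i exp(lambda * g_i); the implied
    weights w_i proportional to exp(lambda * g_i) then satisfy
    sum_i w_i * g_i = 0. For theta outside the convex hull of the data
    (here, the range of x) no such lambda exists -- the constraint that
    the proposed regularization removes.
    """
    g = x - theta
    lam = 0.0
    for _ in range(iters):  # Newton steps on the strictly convex dual
        e = np.exp(lam * g)
        grad = np.mean(g * e)
        hess = np.mean(g**2 * e)
        lam -= grad / hess
    w = np.exp(lam * g)
    w /= w.sum()
    return np.sum(np.log(w))  # log ETEL = sum_i log w_i

rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, size=50)
# log ETEL peaks at the sample mean, where the weights are uniform (1/n)
assert etel_log_likelihood(x.mean(), x) > etel_log_likelihood(x.mean() + 0.5, x)
```

At the sample mean the tilting parameter is zero and the weights are uniform, giving the maximal value -n log n; away from the hull the Newton iteration diverges, which is exactly the pathology that motivates the regularized formulation.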
Speaker: Yoonji Kim

Title: Sequential Bayesian Registration for Functional Data

Abstract: In many modern applications, discretely observed data may be naturally understood as a set of functions. Functional data often exhibit two confounded sources of variability: amplitude (y-axis) and phase (x-axis). The extraction of amplitude and phase, a process known as registration, is essential in exploring the underlying structure of functional data in a variety of areas, from environmental monitoring to medical imaging. Critically, such data are often gathered sequentially, with new functional observations arriving over time. Despite this, most available registration procedures are only applicable to batch learning, leading to inefficient computation. To address these challenges, we introduce a Bayesian framework for sequential registration of functional data, which updates statistical inference as new sets of functions are assimilated. This Bayesian model-based sequential learning approach utilizes sequential Monte Carlo sampling to recursively update the alignment of observed functions while accounting for associated uncertainty. As a result, distributed computing, which is not generally an option in batch learning, significantly reduces computational cost. Simulation studies and comparisons to existing batch learning methods reveal that the proposed approach performs well even when the target posterior distribution has a challenging structure.