
Dissertation Defense:
Eunseop Kim, Statistics Ph.D. Candidate
Title: Extensions of Empirical Likelihood for Multiple Testing and Bayesian Inference
Abstract:
Statistical models defined through estimating equations, or moment conditions, have gained popularity over the past few decades because they enable inference on targeted quantities without the restrictive assumptions routinely made in parametric models. Within this framework, empirical likelihood has become a popular nonparametric likelihood that supports a likelihood-driven style of inference, extending the use of likelihood from heavily structured parametric problems to those with minimal restrictions. Empirical likelihood exhibits many desirable properties of parametric likelihood, including the Wilks phenomenon and Bartlett correctability. Additionally, confidence regions from empirical likelihood have data-driven shapes and orientations.
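For readers unfamiliar with the construction, the standard definition of empirical likelihood for a parameter identified by a moment condition, and the Wilks phenomenon mentioned above, can be written as follows (this is the classical formulation due to Owen, not material specific to this dissertation):

```latex
% Empirical likelihood ratio for a moment condition E[g(X,\theta)] = 0,
% based on observations X_1, \dots, X_n:
R(\theta) = \max\Big\{ \prod_{i=1}^{n} n w_i \;:\;
    w_i \ge 0,\; \sum_{i=1}^{n} w_i = 1,\;
    \sum_{i=1}^{n} w_i\, g(X_i, \theta) = 0 \Big\}.
% Wilks phenomenon: at the true parameter \theta_0, with q = \dim g,
-2 \log R(\theta_0) \;\xrightarrow{d}\; \chi^2_q .
```

The chi-squared calibration is what allows likelihood-ratio-style tests and confidence regions without specifying a parametric model.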
This dissertation focuses on two extensions of empirical likelihood methods. The first extends the scope of empirical likelihood to multiple hypothesis testing. Building on a computational strategy for hypothesis testing with empirical likelihood, we develop a framework for applying empirical likelihood to the analysis of designed experiments, addressing issues that arise from blocking and multiple comparisons. Technical results identify an appropriate limiting distribution for a set of comparisons of interest. We propose two single-step multiple testing procedures, one based on asymptotic Monte Carlo approximation and one on the nonparametric bootstrap; both asymptotically control the generalized family-wise error rate and yield simultaneous confidence intervals for the comparisons of interest without explicitly considering the underlying covariance structure. A simulation study and an application to experimental data demonstrate that the performance of the procedures is robust to violations of the standard assumptions for designed experiments.
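The dissertation's procedures are built on empirical likelihood statistics; as a generic illustration of the single-step, max-type bootstrap idea they relate to, here is a minimal sketch using studentized means. All data and variable names are hypothetical, and this is the classical max-T construction rather than the method proposed in the dissertation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: n observations of p treatment-vs-control differences;
# the last two comparisons have a true nonzero effect.
n, p = 200, 4
x = rng.normal(loc=[0.0, 0.0, 0.5, 0.5], scale=1.0, size=(n, p))

# Observed studentized statistics for each comparison.
mean = x.mean(axis=0)
se = x.std(axis=0, ddof=1) / np.sqrt(n)
t_obs = np.abs(mean / se)

# Nonparametric bootstrap of the max statistic: resample rows, recompute
# studentized means centered at the observed means, and record the maximum
# over comparisons.  The max captures the joint dependence of the tests
# without modeling the covariance structure explicitly.
B = 1000
max_stats = np.empty(B)
for b in range(B):
    xb = x[rng.integers(0, n, size=n)]
    mb = xb.mean(axis=0)
    sb = xb.std(axis=0, ddof=1) / np.sqrt(n)
    max_stats[b] = np.max(np.abs((mb - mean) / sb))

# Single-step critical value: a quantile of the bootstrap max distribution.
# Comparing every t_obs to this one cutoff controls the family-wise error
# rate at the 5% level asymptotically.
crit = np.quantile(max_stats, 0.95)
reject = t_obs > crit
print("critical value:", crit)
print("rejections:", reject)
```

The same critical value also inverts into simultaneous confidence intervals of the form `mean ± crit * se`, which is how single-step procedures deliver both tests and intervals from one bootstrap run.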
The second extension focuses on Bayesian inference with empirical likelihood. We propose a variant of empirical likelihood suited to Bayesian inference, called regularized exponentially tilted empirical likelihood. The proposal relies on a regularization technique that expands the posterior domain to the entire parameter space. We provide two perspectives on the regularization: as the limit of a procedure that adds pseudo-data, and as exponential tilting toward a mixture of discrete and continuous distributions. We demonstrate that our proposal retains the desirable properties of empirical likelihood and show, through simulation and data analysis, that it is a suitable pseudo-likelihood for Bayesian inference.
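The regularization itself is the dissertation's contribution; the exponentially tilted empirical likelihood that it modifies is a standard construction (due to Schennach) and can be sketched as follows:

```latex
% Exponentially tilted empirical likelihood: the weights exponentially
% tilt the empirical distribution to satisfy the moment constraint,
w_i(\theta) = \frac{\exp\{\lambda(\theta)^\top g(X_i,\theta)\}}
                   {\sum_{j=1}^{n} \exp\{\lambda(\theta)^\top g(X_j,\theta)\}},
\quad \text{where } \lambda(\theta) \text{ solves }
\sum_{i=1}^{n} \exp\{\lambda^\top g(X_i,\theta)\}\, g(X_i,\theta) = 0,
% and the resulting pseudo-likelihood is
L_{\mathrm{ETEL}}(\theta) = \prod_{i=1}^{n} w_i(\theta).
```

Because the constrained problem has no solution when zero lies outside the convex hull of the \(g(X_i,\theta)\), this pseudo-likelihood is undefined for part of the parameter space; the regularization described in the abstract addresses exactly this domain restriction.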