Seminar: Merlise Clyde

Statistics Seminar Series
December 1, 2016
All Day
209 W. Eighteenth Ave. (EA), Room 170

Title

Scalability and Scale Mixtures of Normals in Generalized Linear Models

Speaker

Merlise Clyde, Duke University

Abstract

Mixtures of Zellner's g-priors have been studied extensively in linear models and have been shown to have numerous desirable properties for Bayesian variable selection and model averaging. Several extensions of g-priors to Generalized Linear Models (GLMs) have been proposed in the literature; however, the choice of prior distribution of g and the resulting properties for inference have received considerably less attention. We show how many of these priors may be unified through mixtures of g-priors in GLMs by assigning the truncated Compound Confluent Hypergeometric distribution to the shrinkage factor 1/(1+g). Special cases include the hyper-g, Beta-prime, truncated Gamma, incomplete inverse-Gamma, benchmark, robust, hyper-g/n, and intrinsic priors. Under an integrated Laplace approximation, the posterior distribution of 1/(1+g) and the marginal likelihoods are functions of these hypergeometric functions, leading to "Compound Hypergeometric Information Criteria" for model selection. We discuss the local geometric properties of the priors in GLMs and show how desiderata for model selection proposed by Bayarri et al., such as asymptotic model selection consistency, intrinsic consistency, predictive matching, and measurement invariance, may be used to justify the prior and the choices of hyperparameters, as well as their limitations. If time permits, we discuss how these priors on shrinkage parameters relate to generalized ridge priors via data augmentation.
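
As a brief background sketch (not part of the abstract itself, using generic notation with β for the regression coefficients, X for the design matrix, and σ² for the error variance), Zellner's g-prior in the Gaussian linear model is commonly written as

$$ \beta \mid g, \sigma^2 \sim \mathrm{N}\!\big(0,\; g\,\sigma^2 (X^\top X)^{-1}\big), $$

under which the posterior mean of β shrinks the least-squares estimate by the factor g/(1+g). Placing a prior on the shrinkage factor 1/(1+g), as in the abstract, is therefore one way of specifying a mixture of g-priors.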