**Whitney Award for Research**

**Jeffrey Gory, PhD Candidate in Statistics**

### Marginal Inference in Generalized Linear Mixed Models

A popular approach for relating correlated measurements of a non-Gaussian response variable to a set of predictors is to introduce latent random variables and fit a generalized linear mixed model. The conventional strategy for specifying such a model leads to parameter estimates that must be interpreted conditional on the latent variables. In many cases, interest lies not in these conditional parameters, but rather in marginal parameters that summarize the average effect of the predictors across the entire population. Due to the structure of the generalized linear mixed model, the average effect across all individuals in a population is generally not the same as the effect for an average individual. I will discuss why this is the case and introduce a class of marginally interpretable generalized linear mixed models that lead to parameter estimates with the desired interpretation. I will also address the impact that the choice of model parameterization has on inference for the fixed effects parameters in these models.
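The conditional-versus-marginal distinction described above can be seen numerically. The sketch below is an illustration added here, not part of the talk: it uses a hypothetical random-intercept logistic model, where the nonlinearity of the inverse link means that averaging response probabilities over the latent intercepts does not reproduce the response of an individual with the average (zero) intercept.

```python
import numpy as np

# Illustrative sketch (not from the talk): a random-intercept logistic model
# with hypothetical values beta0 = 1 and random-effect SD sigma = 2.
rng = np.random.default_rng(0)

beta0, sigma = 1.0, 2.0
b = rng.normal(0.0, sigma, 1_000_000)   # latent random intercepts

def expit(x):
    return 1.0 / (1.0 + np.exp(-x))

# Effect for an "average" individual (latent intercept b = 0) ...
avg_individual = expit(beta0)

# ... versus the average effect across all individuals in the population.
population_avg = expit(beta0 + b).mean()

# Because E[expit(beta0 + b)] != expit(beta0), the two summaries disagree:
# the population average is pulled toward 0.5 relative to expit(beta0).
print(avg_individual, population_avg)
```

Under this parameterization the population-average probability is noticeably smaller than the probability for the average individual, which is precisely why marginally interpretable parameterizations are of interest.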

**Anna Smith, PhD Candidate in Statistics**

### A Bayesian Hyperbolic Latent Distance Model for Network Data

Latent space models are a popular framework for modeling network data, in which dyadic ties are assumed to be conditionally independent given the distances between nodes’ unobserved positions in a latent space. Typically, the latent space is taken to be Euclidean; however, extensions to other spaces (e.g., elliptic and ultrametric spaces) have been considered by various authors. In this talk, we present a new version of this modeling framework that embeds nodes in a latent hyperbolic space. As others have demonstrated in different settings, hyperbolic embeddings of networks can provide an intuitive way of analyzing network data, primarily because the behavior of distances in negatively curved hyperbolic space mimics the behavior of distances (shortest path lengths) in a network. We consider this idea in a formal (parametric) statistical modeling framework, so that our proposed Bayesian hyperbolic latent distance model is fully equipped to support formal statistical inference on the unknown model parameters that describe network structural features. Model fitting via adaptive MCMC is briefly discussed, as is an extension to the hierarchical setting in which multiple networks are observed and can be regarded as realizations from a single data-generating process. We illustrate the potential of this hierarchical extension by comparing activity pattern networks constructed from the Los Angeles Family and Neighborhood Survey (L.A.FANS) data across sampled census tracts.
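The core ingredient of such a model can be sketched in a few lines. The example below is an illustration added here, not taken from the talk: it computes distances in the standard Poincaré disk model of hyperbolic space and plugs them into a Hoff-style logistic tie probability; the function names and the specific link form are assumptions.

```python
import numpy as np

def poincare_distance(u, v):
    """Hyperbolic distance between two points inside the unit Poincaré disk."""
    uu = np.sum(u * u)
    vv = np.sum(v * v)
    duv = np.sum((u - v) ** 2)
    # Standard closed form for the Poincaré disk metric.
    return np.arccosh(1.0 + 2.0 * duv / ((1.0 - uu) * (1.0 - vv)))

def tie_probability(u, v, alpha=1.0):
    # Hypothetical logistic link: nodes closer together in the latent
    # hyperbolic space are more likely to share a tie.
    return 1.0 / (1.0 + np.exp(-(alpha - poincare_distance(u, v))))

u = np.array([0.10, 0.0])
v = np.array([0.50, 0.0])
w = np.array([0.12, 0.0])
print(tie_probability(u, w), tie_probability(u, v))
```

Near the boundary of the disk, distances grow much faster than the Euclidean coordinates suggest, which is the tree-like behavior that makes hyperbolic embeddings attractive for networks.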

**Cooley Memorial Prize**

**Andrew Bean, PhD Candidate in Statistics**

### Transformations and Bayesian Density Estimation

We present a method for Bayesian density estimation when the true distribution may be skewed or heavy-tailed. For estimating a continuous density, the most popular priors, based on countable mixtures of normals, perform well when the true distribution is light-tailed, but skewness and heavy tails degrade the quality of inference. We adopt a transformation-density estimation strategy: we first estimate a parametric transformation aimed at symmetrizing the sample and shortening its tails, then use a standard Bayesian mixture model to estimate the density on the transformed scale. Empirical evidence from simulations shows that this basic strategy leads to improved inference on the original scale. We discuss the impact of this technique on the asymptotic behavior of the posterior distribution under sampling from a broad class of heavy-tailed densities.
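The two-step strategy can be sketched numerically. The example below is an illustration added here, not the talk's method: it uses a fixed arcsinh transform in place of an estimated parametric one, and a single Gaussian fit as a crude stand-in for a Bayesian mixture model. The change-of-variables formula f_X(x) = f_Z(t(x)) |t'(x)| is what maps the estimate back to the original scale.

```python
import numpy as np

# Illustrative sketch of transformation-density estimation (assumed choices
# throughout, not the talk's exact procedure).
rng = np.random.default_rng(1)
x = rng.standard_cauchy(5000)             # heavy-tailed sample

t = np.arcsinh                            # tail-shortening transform (assumed)
def dt(x):
    return 1.0 / np.sqrt(1.0 + x ** 2)    # derivative of arcsinh

z = t(x)                                  # transformed, lighter-tailed sample
mu, sd = z.mean(), z.std()                # crude stand-in for a mixture fit

def density_original_scale(xs):
    """Map the fitted density on the transformed scale back to the x-scale."""
    zs = t(xs)
    f_z = np.exp(-0.5 * ((zs - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))
    return f_z * dt(xs)                   # change of variables: f_Z(t(x)) |t'(x)|
```

By construction the back-transformed estimate is a valid density (nonnegative, integrating to one), even though the fit was done entirely on the better-behaved transformed scale.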