Seminar Series: Alex Dombowsky

Alex Dombowsky
March 20, 2025
3:00PM - 4:00PM
EA 170

Speaker: Alex Dombowsky

Title: Robust Point Estimation in Bayesian Clustering

Abstract: Bayesian clustering typically relies on mixture models, with each component interpreted as a different cluster. After defining a prior for the component parameters and weights, Markov chain Monte Carlo (MCMC) algorithms are commonly used to produce samples from the posterior distribution of the component labels. The data are then clustered by minimizing the expectation of a clustering loss function that favors similarity to the component allocations. Unfortunately, although these approaches are routinely implemented, clustering results are highly sensitive to kernel misspecification. For example, if Gaussian kernels are used but the true density of data within a cluster is even slightly non-Gaussian, then clusters will be broken into multiple Gaussian components. To address this problem, we develop Fusing of Localized Densities (FOLD), a novel clustering method that melds components together using the posterior of the kernels. We accomplish this by rewarding the co-clustering of observations whose mixture kernels are close under a statistical distance. FOLD has a fully Bayesian decision-theoretic justification, naturally leads to uncertainty quantification, and can be easily implemented as an add-on to MCMC algorithms for mixtures. In addition, we provide theoretical support for FOLD, including novel guarantees that show asymptotic clustering optimality under kernel misspecification. In an application to single-cell RNA sequencing, FOLD outperforms competitors by inferring a small number of meaningful clusters that correspond to known cell types.
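
To illustrate the core idea described in the abstract (clustering observations by the closeness of their posterior localized kernels rather than by raw component labels), here is a minimal Python sketch. It is not the authors' FOLD implementation or loss function; it assumes univariate Gaussian kernels, pre-computed MCMC draws, a closed-form Hellinger distance, and a simple average-linkage cut, and all function and variable names are hypothetical.

```python
# Minimal sketch (not the FOLD implementation): average, over MCMC draws, the
# Hellinger distance between the Gaussian kernels assigned to each pair of
# observations, then cut a hierarchical clustering of that distance matrix.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform


def hellinger_gaussian(mu1, sig1, mu2, sig2):
    """Closed-form Hellinger distance between two univariate Gaussians."""
    s1, s2 = sig1 ** 2, sig2 ** 2
    bc = np.sqrt(2.0 * sig1 * sig2 / (s1 + s2)) * np.exp(
        -0.25 * (mu1 - mu2) ** 2 / (s1 + s2)
    )
    return np.sqrt(np.maximum(1.0 - bc, 0.0))


def fold_style_clusters(labels, mus, sigmas, n_clusters):
    """
    labels : (T, n) int array of MCMC draws of component allocations
    mus    : (T, K) MCMC draws of component means
    sigmas : (T, K) MCMC draws of component standard deviations
    Returns a hard clustering of the n observations based on how close
    their posterior localized densities are, rather than their labels.
    """
    T, n = labels.shape
    D = np.zeros((n, n))
    for t in range(T):
        mu_t = mus[t, labels[t]]       # kernel mean assigned to each observation
        sig_t = sigmas[t, labels[t]]   # kernel sd assigned to each observation
        D += hellinger_gaussian(
            mu_t[:, None], sig_t[:, None], mu_t[None, :], sig_t[None, :]
        )
    D /= T  # posterior expected pairwise kernel distance
    Z = linkage(squareform(D, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")
```

In this toy version, observations whose assigned kernels stay close across posterior draws end up in the same cluster even if they were allocated to different Gaussian components, which is the behavior the abstract attributes to FOLD; the actual method formalizes this through a decision-theoretic loss and also provides uncertainty quantification, which this sketch omits.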