Seminar: Hua Zhou

Statistics Seminar
February 2, 2010
All Day
209 W. Eighteenth Ave. (EA), Room 170

Title

Optimization in Some High-Dimensional Statistical Problems

Speaker

Hua Zhou, University of California, Los Angeles

Abstract

Modern high-throughput datasets from data mining, genomics, and imaging demand high-dimensional models with tens to hundreds of thousands of parameters. This poses challenges to classical optimization techniques such as Newton's method and Fisher scoring, which involve storing, computing, and inverting a large Hessian or information matrix. If parameter constraints or parameter bounds intrude, the algorithms require further modification. Although numerical analysts have devised numerous remedies and safeguards, these all come at the cost of greater implementation complexity.

In this talk, I first describe the minorization-maximization (MM) principle, which generalizes the celebrated expectation-maximization (EM) algorithm and offers a versatile weapon for attacking optimization problems of this sort. I then present two devices for accelerating MM algorithms in high-dimensional problems: a quasi-Newton scheme and a parallel computing method based on inexpensive graphics processing units (GPUs). Both offer one to two orders of magnitude of improvement over the naive algorithm. Applications include positron emission tomography, nonnegative matrix factorization, a movie rating algorithm, and multidimensional scaling. Time permitting, I will also present several variations of deterministic annealing that tend to avoid inferior modes and find the dominant mode in some non-convex statistical problems.
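
As a concrete illustration of the MM principle (this sketch is not taken from the talk itself): the classic multiplicative updates for nonnegative matrix factorization, one of the applications listed above, arise from majorizing the squared Frobenius loss and fit in a few lines of Python. The function name nmf_mm and the toy data below are illustrative, not the speaker's code.

    import numpy as np

    def nmf_mm(X, rank, n_iter=200, eps=1e-9, seed=0):
        # Nonnegative matrix factorization X ~ W @ H via multiplicative
        # (MM-type) updates for the squared Frobenius loss.
        rng = np.random.default_rng(seed)
        m, n = X.shape
        W = rng.random((m, rank))
        H = rng.random((rank, n))
        for _ in range(n_iter):
            # Each update minimizes a majorizing surrogate of the loss,
            # so the objective decreases while the factors stay nonnegative.
            H *= (W.T @ X) / (W.T @ W @ H + eps)
            W *= (X @ H.T) / (W @ H @ H.T + eps)
        return W, H

    # Toy usage on a small random nonnegative matrix
    X = np.abs(np.random.default_rng(1).normal(size=(50, 30)))
    W, H = nmf_mm(X, rank=5)
    print(np.linalg.norm(X - W @ H))

Because each factor update touches only matrix multiplications and elementwise operations, updates of this kind are also natural candidates for the GPU parallelization described in the abstract.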