
Title
Dimension Reduction in Regression
Speaker
Shaoli Wang, Penn State University
Abstract
Recent advances in science and computer technology increasingly demand the processing of high-dimensional data. Dimension reduction is a body of theory and methods developed to meet such demands. In the context of regression and classification, dimension reduction means reducing the dimension of the predictors without loss of information about the relation between predictors and response. Results obtained over the past decade or so indicate that dimension reduction can be particularly useful during the model-building phase, as it usually does not require a pre-specified parametric model linking predictors and response. Two assumptions are usually made on the marginal distribution of the predictors: a linearity condition and a constant covariance condition, which can be too restrictive for some applications. A new iterative method, Iterative SAVE Transformation (IST), is proposed to estimate and make inferences about the dimension reduction subspace. The new method requires only the linearity condition, yet can estimate more directions in the dimension reduction subspace. Asymptotic results are derived for IST, based on which a testing procedure is introduced for estimating the order of the dimension reduction subspace. A further exploration of the eigenvalue and eigenvector structure of the iteration matrices reveals the mechanism that makes iterative methods work. A simulation study comparing IST to existing methods illustrates its advantages, and the method is applied to an ozone data set.
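For readers unfamiliar with the SAVE method that IST iterates on, the following is a minimal sketch of classical sliced average variance estimation (SAVE) in numpy. It is background only, not the IST procedure described in the abstract: the function name, the slicing scheme, and the weighting are generic textbook choices, assumed here for illustration. The idea is to standardize the predictors, slice the data by the response, average the matrices (I - cov(Z | slice))^2, and take the leading eigenvectors as estimated directions in the dimension reduction subspace.

```python
import numpy as np

def save_directions(X, y, n_slices=5, n_dirs=1):
    """Sketch of classical SAVE (sliced average variance estimation).

    Returns `n_dirs` estimated dimension-reduction directions, as unit
    columns on the original X scale. Illustrative only, not IST.
    """
    n, p = X.shape
    # Standardize predictors: Z = Sigma^{-1/2} (X - mean)
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Slice observations by the order of the response y
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # SAVE kernel: weighted average of (I - cov(Z | slice))^2
    M = np.zeros((p, p))
    for idx in slices:
        D = np.eye(p) - np.cov(Z[idx], rowvar=False)
        M += (len(idx) / n) * D @ D
    # Leading eigenvectors of M, mapped back to the X scale
    w, v = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)
```

For example, with y = x1^2 + noise (a symmetric relationship that slicing the mean alone would miss), SAVE recovers the direction e1. Note that plain SAVE needs both the linearity and constant covariance conditions; relaxing the latter while recovering more directions is precisely the improvement the talk attributes to IST.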