This week's Statistics Student Seminar speaker will be Maximillian Chen.
Talk Title: Dimension Reduction and Inferential Procedures for Images
High-dimensional data analysis has been a prominent topic of statistical research in recent years due to the growing presence of high-dimensional electronic data. Much of this work concerns samples of high-dimensional multivariate (vector-valued) data; considerably less attention has been paid to samples of matrix-variate data. The population value decomposition (PVD), introduced by Crainiceanu et al. (2011), is a method for dimension reduction of a population of massive images. Each image is decomposed into the product of two orthogonal matrices containing population-specific features and one matrix containing subject-specific features. Two open problems in the PVD framework are choosing the optimal row and column dimensions of reduction for the population of data matrices and conducting inference. To find the optimal row and column dimensions, we build on the low-rank approximation and optimization procedures of Manton et al. (2003). To develop our inferential procedures, we assume the data are matrix normally distributed. We introduce likelihood-ratio tests, score tests, and regression-based inferential procedures for the one-, two-, and k-sample problems, and we derive the distributions of the resulting test statistics. Practical implementation will also be discussed.
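For readers unfamiliar with the PVD, the NumPy sketch below illustrates the kind of two-sided decomposition the abstract describes: each subject's image Y_i is approximated as P V_i D^T, where P and D are shared across the population and V_i is subject-specific. The function name `pvd` and the SVD-stacking construction are an illustrative assumption for this sketch, not the speaker's implementation, and the row/column ranks are taken as given rather than chosen optimally.

```python
import numpy as np

def pvd(images, row_rank, col_rank):
    """Illustrative sketch of a population value decomposition (PVD).

    images   : list of (F, T) arrays, one image per subject
    Returns population matrices P (F x row_rank) and D (T x col_rank)
    with orthonormal columns, plus subject-specific cores
    V_i = P.T @ Y_i @ D, so each image is approximated by P @ V_i @ D.T.
    """
    lefts, rights = [], []
    for Y in images:
        # Subject-level SVD; keep the leading singular vectors.
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        lefts.append(U[:, :row_rank])
        rights.append(Vt[:col_rank, :].T)
    # Population features: leading singular vectors of the stacked
    # subject-level factors.
    P, _, _ = np.linalg.svd(np.hstack(lefts), full_matrices=False)
    D, _, _ = np.linalg.svd(np.hstack(rights), full_matrices=False)
    P, D = P[:, :row_rank], D[:, :col_rank]
    cores = [P.T @ Y @ D for Y in images]
    return P, D, cores
```

If the images truly share low-dimensional row and column spaces, the reconstruction P @ V_i @ D.T recovers each Y_i up to the chosen ranks; choosing those ranks well is exactly the optimization problem the talk addresses.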