SEM217: Youhong Lee, UC Santa Barbara: Regularized Estimators in High Dimensional PCA

Tuesday, April 11th @ 11:00-12:30 PM (RECORDING)

The idea of regularization, which combines a simply structured target with a classical estimator, is popular in high-dimensional data analysis. We propose a new regularization method, together with a fast machine learning algorithm, called direction-regularized principal component analysis (drPCA). The method solves a PCA problem that seeks the direction of maximum variance in the data subject to a prior target direction. An asymptotic analysis of the solution under the high-dimension, low-sample-size framework yields an optimal tuning parameter that minimizes an asymptotic loss function, and the estimator corresponding to this optimal tuning parameter is learned quickly from the data. We also show that, under certain covariance structures, our estimator is equivalent to the Ledoit-Wolf constant-correlation shrinkage estimator and to a recently proposed James-Stein estimator for the first principal component.
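The abstract does not spell out the drPCA formulation, but the idea of shrinking the leading principal component toward a prior target direction can be sketched as follows. This is a hypothetical illustration, not the authors' algorithm: it assumes the regularized estimator interpolates between the sample first PC and the target direction, with a tuning parameter `gamma` (here chosen by the user rather than by the asymptotic analysis described in the talk).

```python
import numpy as np

def dr_pca_first_pc(X, target, gamma):
    """Hypothetical direction-regularized first principal component.

    Shrinks the sample leading eigenvector toward a unit prior target
    direction and renormalizes. gamma in [0, 1] is the tuning
    parameter: gamma = 0 recovers ordinary PCA, gamma = 1 returns
    the normalized target direction.
    """
    Xc = X - X.mean(axis=0)                # center the data
    S = Xc.T @ Xc / (Xc.shape[0] - 1)      # sample covariance matrix
    _, eigvecs = np.linalg.eigh(S)         # eigenvalues in ascending order
    v = eigvecs[:, -1]                     # sample first PC (unit norm)
    d = np.asarray(target, dtype=float)
    d = d / np.linalg.norm(d)              # unit prior direction
    if v @ d < 0:                          # resolve eigenvector sign ambiguity
        v = -v
    u = (1 - gamma) * v + gamma * d        # shrink toward the target
    return u / np.linalg.norm(u)
```

For example, with `gamma` near 1 the estimator is dominated by the prior direction, which loosely mirrors how the constant-correlation target in Ledoit-Wolf shrinkage pulls the estimate toward an equal-weight structure.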