Tuesday, November 8th @ 11:00-12:30 PM (ONLINE)
We consider the problem of testing linear hypotheses associated with a multivariate linear regression model. Classical tests for hypotheses of this type, based on likelihood ratio statistics, suffer a substantial loss of power when the dimensionality of the observations is comparable to the sample size. To mitigate this problem, we propose two different classes of regularized test procedures that rely on a nonlinear shrinkage of the eigenvalues, and possibly eigenprojections, of the estimated noise covariance matrix. The first approach utilizes a ridge-type shrinkage, while the second works under the structural assumption that the population noise covariance matrix has a spiked eigenvalue structure. We address the problem of finding the optimal regularization parameter in each case by making use of decision-theoretic principles.
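As an illustrative sketch only (not the speaker's actual procedure, whose shrinkage is nonlinear and decision-theoretically tuned), the ridge-type idea can be pictured as follows: fit the multivariate linear model by least squares, estimate the noise covariance from the residuals, and shift its eigenvalues by a regularization parameter `lam` so the estimate stays well conditioned even when the response dimension is comparable to the sample size. All dimensions and the value of `lam` below are arbitrary choices for the example.

```python
import numpy as np

# Toy multivariate linear model Y = X B + E with response dimension p
# comparable to the sample size n (illustrative values only).
rng = np.random.default_rng(0)
n, p, q = 50, 40, 3                       # sample size, responses, predictors
X = rng.standard_normal((n, q))
B = rng.standard_normal((q, p))
Y = X @ B + rng.standard_normal((n, p))

# Least-squares fit and residual-based noise covariance estimate.
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
E_hat = Y - X @ B_hat
S = E_hat.T @ E_hat / (n - q)

# Ridge-type shrinkage: adding lam * I shifts every eigenvalue of S
# upward by lam, bounding the condition number. In the talk's setting,
# lam would be chosen by decision-theoretic criteria rather than fixed.
lam = 0.1
S_ridge = S + lam * np.eye(p)
```

The point of the shift is that eigenvalues of `S_ridge` equal those of `S` plus `lam`, so the smallest eigenvalue is bounded away from zero and the inverse used in a test statistic remains stable in the high-dimensional regime.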