The Stein paradox has played an influential role in the field of high-dimensional statistics. This result warns that the sample mean, classically regarded as the “usual estimator”, may be suboptimal in high dimensions. The development of the James-Stein estimator, which addresses this paradox, has by now inspired a large literature on the theme of “shrinkage” in statistics. In this direction, we develop a James-Stein type estimator for the first principal component of a high-dimension, low-sample-size data set. This estimator shrinks the usual estimator, an eigenvector of a sample covariance matrix under a spiked covariance model, and yields superior asymptotic guarantees. Our derivation draws a close connection to the original James-Stein formula, so that the motivation and recipe for shrinkage can be intuited in a natural way.
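For reference, the original James-Stein formula alluded to above (the classical mean-estimation setting, not the new principal-component estimator) shrinks a single observation toward the origin. A standard statement, for an observation $x \sim N(\theta, \sigma^2 I_p)$ with known $\sigma^2$ and dimension $p \ge 3$, is

\[
\hat{\theta}_{\mathrm{JS}} \;=\; \left(1 - \frac{(p-2)\,\sigma^2}{\|x\|^2}\right) x,
\]

which dominates the usual estimator $\hat{\theta} = x$ in total squared-error risk whenever $p \ge 3$.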