Journal article details
Journal of Multivariate Analysis, Volume 186
Shrinkage estimation of large covariance matrices: Keep it simple, statistician?
Article
Ledoit, Olivier1,2  Wolf, Michael1 
[1] Univ Zurich, Dept Econ, Zurich, Switzerland
[2] AlphaCrest Capital Management, New York, NY USA
Keywords: Large-dimensional asymptotics; Random matrix theory; Rotation equivariance
DOI: 10.1016/j.jmva.2021.104796
Source: Elsevier
【 Abstract 】

Under rotation-equivariant decision theory, sample covariance matrix eigenvalues can be optimally shrunk by recombining sample eigenvectors with a (potentially nonlinear) function of the unobservable population covariance matrix. The optimal shape of this function reflects the loss/risk that is to be minimized. We solve the problem of optimal covariance matrix estimation under a variety of loss functions motivated by statistical precedent, probability theory, and differential geometry. A key ingredient of our nonlinear shrinkage methodology is a new estimator of the angle between sample and population eigenvectors, without making strong assumptions on the population eigenvalues. We also introduce a broad family of covariance matrix estimators that can handle all regular functional transformations of the population covariance matrix under large-dimensional asymptotics. In addition, we compare via Monte Carlo simulations our methodology to two simpler ones from the literature, linear shrinkage and shrinkage based on the spiked covariance model. (C) 2021 The Author(s). Published by Elsevier Inc.
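One of the two simpler benchmarks the abstract mentions, linear shrinkage, is available off the shelf in scikit-learn as `sklearn.covariance.LedoitWolf`. As a rough illustration of why shrinkage helps in the large-dimensional regime the paper targets, the sketch below compares the sample covariance matrix with its linearly shrunk counterpart under Frobenius loss; the simulated population covariance (diagonal with dispersed eigenvalues) and the dimensions `p = 50`, `n = 100` are illustrative choices, not taken from the paper, and the paper's own nonlinear shrinkage estimator is not implemented here.

```python
import numpy as np
from sklearn.covariance import LedoitWolf

# Illustrative large-dimensional setting: n is not much larger than p,
# so the sample covariance matrix is a noisy estimator.
rng = np.random.default_rng(0)
p, n = 50, 100

# Hypothetical population covariance with dispersed eigenvalues.
Sigma = np.diag(np.linspace(1.0, 10.0, p))
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Plain sample covariance matrix (maximum-likelihood normalization).
S = np.cov(X, rowvar=False, bias=True)

# Linear shrinkage: a convex combination of the sample covariance
# matrix and a scaled identity target, with a data-driven weight.
lw = LedoitWolf().fit(X)

# Compare both estimators to the true Sigma under Frobenius loss.
loss_sample = np.linalg.norm(S - Sigma, "fro")
loss_shrunk = np.linalg.norm(lw.covariance_ - Sigma, "fro")
print(f"sample loss:    {loss_sample:.2f}")
print(f"shrinkage loss: {loss_shrunk:.2f}")
print(f"shrinkage weight: {lw.shrinkage_:.3f}")
```

In this regime the shrunk estimator typically incurs a smaller Frobenius loss than the raw sample covariance matrix; the paper's contribution is to shrink each eigenvalue nonlinearly, and under a menu of losses beyond Frobenius.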

【 License 】

Free (open access)

【 Preview 】

Attachments:
File: 10_1016_j_jmva_2021_104796.pdf | Size: 915 KB | Format: PDF