Journal Article Details
JOURNAL OF MULTIVARIATE ANALYSIS, Vol. 171
Fixed support positive-definite modification of covariance matrix estimators via linear shrinkage
Article
Choi, Young-Geun1  Lim, Johan2  Roy, Anindya3  Park, Junyong3 
[1] SK Telecom, Data R&D Ctr, 65 Eulji Ro, Seoul 04539, South Korea
[2] Seoul Natl Univ, Dept Stat, 1 Gwanak Ro, Seoul 08826, South Korea
[3] Univ Maryland Baltimore Cty, Dept Math & Stat, 1100 Hilltop Circle, Baltimore, MD 21250 USA
Keywords: Covariance matrix; High-dimensional estimation; Linear minimax classification problem; Linear shrinkage; Portfolio optimization; Positive definiteness; Precision matrix
DOI: 10.1016/j.jmva.2018.12.002
Source: Elsevier
【 Abstract 】

This paper is concerned with the positive definiteness (PDness) problem in covariance matrix estimation. For high-dimensional data, many regularized estimators have been proposed under structural assumptions on the true covariance matrix, including sparsity. They were shown to be asymptotically consistent and rate-optimal in estimating the true covariance matrix and its structure. However, many of them do not take the PDness of the estimator into account and can produce a non-PD estimate. To achieve PDness, researchers have considered additional regularizations (or constraints) on the eigenvalues, which make both the asymptotic analysis and the computation much harder. In this paper, we propose a simple modification of the regularized covariance matrix estimator that makes it PD while preserving its support. We revisit the idea of linear shrinkage and propose taking a convex combination of the first-stage estimator (the regularized covariance matrix without PDness) and a given form of diagonal matrix. The proposed modification, which we call the FSPD (Fixed Support and Positive Definiteness) estimator, is shown to preserve the asymptotic properties of the first-stage estimator if the shrinkage parameters are carefully selected. It has a closed-form expression and its computation is optimization-free, unlike existing PD sparse estimators. In addition, the FSPD is generic in the sense that it can be applied to any non-PD matrix, including the precision matrix. The FSPD estimator is numerically compared with other sparse PD estimators to understand its finite-sample properties as well as its computational gain. It is also applied to two multivariate procedures relying on the covariance matrix estimator, the linear minimax classification problem and the Markowitz portfolio optimization problem, and is shown to substantially improve the performance of both procedures. (C) 2018 Elsevier Inc. All rights reserved.
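The linear-shrinkage idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact selection rule: it shrinks a symmetric estimator toward mu * I via the convex combination alpha * Sigma + (1 - alpha) * mu * I, with alpha chosen in closed form so the smallest eigenvalue reaches a target eps. The defaults for eps and mu (the average diagonal entry) are illustrative assumptions.

```python
import numpy as np

def fspd(sigma_hat, eps=1e-2):
    """Illustrative FSPD-style modification of a symmetric estimator.

    Returns alpha * sigma_hat + (1 - alpha) * mu * I with alpha chosen so
    that the smallest eigenvalue equals eps.  Because the identity term
    only touches the diagonal, off-diagonal zeros (the support) are
    preserved, merely scaled by alpha.  The choices of eps and mu here
    are placeholders for the paper's careful selection rule.
    """
    lam_min = np.linalg.eigvalsh(sigma_hat)[0]
    if lam_min >= eps:
        return sigma_hat  # already sufficiently positive definite
    # Illustrative target: average diagonal entry, floored above eps.
    mu = max(np.mean(np.diag(sigma_hat)), 2 * eps)
    # Eigenvalues of the combination are alpha*lam_i + (1-alpha)*mu,
    # so setting the smallest to eps gives a closed-form alpha in (0, 1).
    alpha = (mu - eps) / (mu - lam_min)
    p = sigma_hat.shape[0]
    return alpha * sigma_hat + (1 - alpha) * mu * np.eye(p)

# A sparse symmetric matrix that is not PD (smallest eigenvalue < 0):
S = np.array([[1.0, 0.9, 0.0],
              [0.9, 1.0, 0.9],
              [0.0, 0.9, 1.0]])
T = fspd(S, eps=1e-2)
```

No optimization is involved: one eigenvalue computation and one closed-form combination, which is the computational gain the abstract refers to.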

【 License 】

Free   
