JOURNAL OF MULTIVARIATE ANALYSIS | Volume: 170
Optimal shrinkage estimator for high-dimensional mean vector
Article
Bodnar, Taras [1]; Okhrin, Ostap [2]; Parolya, Nestor [3]
[1] Stockholm Univ, Dept Math, SE-10691 Stockholm, Sweden
[2] Tech Univ Dresden, Chair Econometr & Stat Esp Transportat, D-01062 Dresden, Germany
[3] Leibniz Univ Hannover, Inst Stat, D-30167 Hannover, Germany
Keywords: Large-dimensional asymptotics; Mean vector estimation; Random matrix theory; Shrinkage estimator
DOI: 10.1016/j.jmva.2018.07.004
Source: Elsevier
【 Abstract 】
In this paper we derive the optimal linear shrinkage estimator for the high-dimensional mean vector using random matrix theory. The results are obtained under the assumption that both the dimension p and the sample size n tend to infinity in such a way that p/n → c ∈ (0, ∞). Under weak conditions imposed on the underlying data generating mechanism, we find the asymptotic equivalents to the optimal shrinkage intensities and estimate them consistently. The proposed nonparametric estimator for the high-dimensional mean vector has a simple structure and is proven to minimize asymptotically, with probability 1, the quadratic loss when c ∈ (0, 1). When c ∈ (1, ∞) we modify the estimator by using a feasible estimator of the precision (inverse covariance) matrix. Finally, an exhaustive simulation study and an application to real data are provided, in which the proposed estimator is compared with known benchmarks from the literature. It turns out that the existing estimators of the mean vector, including the new proposal, converge to the sample mean vector when the true mean vector has an unbounded Euclidean norm. (C) 2018 Elsevier Inc. All rights reserved.
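The linear shrinkage idea described in the abstract can be illustrated with a small numerical sketch. The Python snippet below is a minimal plug-in illustration, not the paper's exact estimator or its consistency results: the function name shrinkage_mean, the all-ones shrinkage target mu0, and the trace-based plug-ins for the unknown quantities are assumptions made here for demonstration. It shrinks the sample mean toward a target vector, with intensities obtained from the normal equations of the quadratic loss after replacing the unknown quantities by simple asymptotic equivalents under p/n → c ∈ (0, 1).

```python
import numpy as np

def shrinkage_mean(X, mu0=None):
    """Linear shrinkage of the sample mean toward a target vector mu0.

    A plug-in sketch in the spirit of Bodnar, Okhrin & Parolya (2019):
    the intensities (alpha, beta) minimize the quadratic loss
    ||alpha * x_bar + beta * mu0 - mu||^2, with the unknown quantities
    replaced by asymptotic equivalents under p/n -> c in (0, 1).
    """
    n, p = X.shape
    x_bar = X.mean(axis=0)
    S = np.cov(X, rowvar=False)          # sample covariance matrix (p x p)
    if mu0 is None:
        mu0 = np.ones(p)                 # illustrative shrinkage target (assumption)
    tr_S_n = np.trace(S) / n             # plug-in for tr(Sigma)/n = E||x_bar - mu||^2
    mu_norm2 = max(x_bar @ x_bar - tr_S_n, 0.0)   # plug-in for ||mu||^2
    cross = x_bar @ mu0                  # plug-in for mu' mu0
    # Normal equations in (alpha, beta), with x_bar'x_bar replaced by
    # ||mu||^2 + tr(Sigma)/n and x_bar'mu replaced by ||mu||^2.
    G = np.array([[mu_norm2 + tr_S_n, cross],
                  [cross, mu0 @ mu0]])
    rhs = np.array([mu_norm2, cross])
    alpha, beta = np.linalg.solve(G, rhs)
    return alpha * x_bar + beta * mu0

# Toy check with p/n = 0.5: compare the losses of the shrinkage estimate
# and the plain sample mean.
rng = np.random.default_rng(0)
n, p = 200, 100
mu = 0.5 * rng.normal(size=p)
X = rng.normal(loc=mu, scale=3.0, size=(n, p))
print(np.linalg.norm(shrinkage_mean(X) - mu),
      np.linalg.norm(X.mean(axis=0) - mu))
```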
【 License 】
Free
【 Preview 】
Files | Size | Format |
---|---|---|
10_1016_j_jmva_2018_07_004.pdf | 793 KB | PDF |