Spectral Properties of Heavy-Tailed Random Matrices

Classical random matrix theory studies the asymptotic spectral properties of random matrices as their dimensions grow to infinity. In contrast, the non-asymptotic branch of the theory focuses on explicit high-probability estimates that hold for random matrices of large but fixed size. This goal naturally brings into play beautiful methods of high-dimensional probability and geometry, such as the concentration of measure phenomenon. One of the less understood random matrix models is the heavy-tailed model, in which the matrix entries have distributions with slower tail decay than Gaussian, e.g., with only a few finite moments. This work is devoted to the study of heavy-tailed random matrices and addresses two main questions: invertibility and regularization of the operator norm. First, the invertibility result of Rudelson and Vershynin is generalized from the case when the matrix entries are subgaussian to the case when only two finite moments are required. Then, it is shown that the operator norm of an n x n random matrix can be reduced to the optimal order O(sqrt(n)) if and only if the entries have zero mean and finite variance. We also study constructive ways to perform such regularization. We show that deletion of a few large entries regularizes the operator norm only if all matrix entries have more than two finite moments. In the case of exactly two finite moments, we propose an algorithm that zeroes out a small fraction of the matrix entries to achieve an operator norm of almost optimal order O(sqrt(n * ln ln n)). Finally, if in the latter case the matrix has scaled Bernoulli entries, we obtain a stronger regularization algorithm that provides (a) an O(sqrt(n)) operator norm for the resulting matrix and (b) a simple structure of the "bad" submatrix to be zeroed out.
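To make the regularization statements concrete, below is a minimal numerical sketch in Python, not the algorithm from the thesis: it draws an n x n matrix with iid symmetrized Pareto entries (zero mean, finite variance, heavy tails) and zeroes out the entries whose magnitude exceeds a cutoff, then compares the operator norm before and after. The helper names heavy_tailed_matrix and regularize, the tail exponent alpha = 2.5, and the n^(1/4) cutoff are illustrative assumptions, not choices made in the work itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def heavy_tailed_matrix(n, alpha=2.5):
    """iid symmetrized Pareto (Lomax) entries: zero mean by symmetry,
    finite variance for alpha > 2, but E|x|^p < inf only for p < alpha."""
    x = rng.pareto(alpha, size=(n, n))            # nonnegative Lomax samples
    signs = rng.choice([-1.0, 1.0], size=(n, n))  # random signs -> zero mean
    x = signs * x
    return x / x.std()                            # normalize to (empirical) unit variance

def regularize(A, threshold):
    """Zero out entries whose magnitude exceeds `threshold`;
    a crude stand-in for the zeroing procedure described in the abstract."""
    B = A.copy()
    B[np.abs(B) > threshold] = 0.0
    return B

n = 2000
A = heavy_tailed_matrix(n)
# Illustrative cutoff ~ n^(1/4); the actual cutoff in the thesis differs.
B = regularize(A, threshold=n ** 0.25)

op = lambda M: np.linalg.norm(M, 2)  # operator (spectral) norm = largest singular value
print(f"before: {op(A) / np.sqrt(n):.2f} * sqrt(n)")
print(f"after : {op(B) / np.sqrt(n):.2f} * sqrt(n)")
print(f"fraction of entries zeroed: {np.mean(B != A):.4f}")
```

On a typical run one should expect the raw norm to sit well above the roughly 2 * sqrt(n) level characteristic of Gaussian entries, driven by a few huge entries, while the truncated matrix comes back near that level after zeroing only a tiny fraction of entries; exact values vary with the seed and with alpha, and for alpha <= 2 (infinite variance) this normalization no longer makes sense.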