Linear Pooling of Sample Covariance Matrices

Elias Raninen, David E. Tyler, Esa Ollila

Research output: Contribution to journal › Article › peer-review



We consider the problem of estimating high-dimensional covariance matrices of K populations or classes in the setting where the sample sizes are comparable to the data dimension. We propose estimating each class covariance matrix as a distinct linear combination of all class sample covariance matrices. This approach is shown to reduce the estimation error when the sample sizes are limited and the true class covariance matrices share a somewhat similar structure. We develop an effective method for estimating the coefficients in the linear combination that minimize the mean squared error under the general assumption that the samples are drawn from (unspecified) elliptically symmetric distributions possessing finite fourth-order moments. To this end, we utilize the spatial sign covariance matrix, which we show (under rather general conditions) to be an asymptotically unbiased estimator of the normalized covariance matrix as the dimension grows to infinity. We also show how the proposed method can be used in choosing the regularization parameters for multiple target matrices in a single-class covariance matrix estimation problem. We assess the proposed method via numerical simulation studies, including an application in global minimum variance portfolio optimization using real stock data.
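The two main ingredients of the abstract can be sketched in a few lines of NumPy: pooling each class estimate as a linear combination of all class sample covariance matrices, and the spatial sign covariance matrix formed from observations projected onto the unit sphere. This is a minimal illustration only; the coefficient matrix `A` below is a hypothetical hand-picked choice, not the MSE-optimal coefficients the paper derives.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_cov(X):
    # Unbiased sample covariance of an (n, p) data matrix.
    return np.cov(X, rowvar=False)

def spatial_sign_cov(X):
    # Spatial sign covariance matrix: average outer product of
    # centered observations scaled to unit Euclidean norm.
    Z = X - X.mean(axis=0)
    Z = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    return Z.T @ Z / Z.shape[0]

def linear_pool(covs, A):
    # covs: list of K class sample covariance matrices (each p x p).
    # A: (K, K) coefficients; row k gives the weights used for class k,
    # so the pooled estimate for class k is sum_j A[k, j] * covs[j].
    return np.einsum('kj,jab->kab', A, np.stack(covs))

# Two classes, sample size comparable to the dimension.
p, n = 20, 25
C1 = sample_cov(rng.standard_normal((n, p)))
C2 = sample_cov(rng.standard_normal((n, p)))
A = np.array([[0.7, 0.3],
              [0.3, 0.7]])  # hypothetical weights for illustration
pooled = linear_pool([C1, C2], A)
```

Note that, by construction, the spatial sign covariance matrix has unit trace (each projected observation contributes a rank-one term with unit trace), which is why it estimates the *normalized* covariance matrix.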

Original language: English (US)
Pages (from-to): 659-672
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
State: Published - 2022

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering


Keywords
  • Covariance matrix
  • elliptical distribution
  • high-dimensional
  • multiclass
  • regularization
  • shrinkage
  • spatial sign covariance matrix


