General maximum likelihood empirical Bayes estimation of normal means

Wenhua Jiang, Cun-Hui Zhang

Research output: Contribution to journal › Article › peer-review

78 Scopus citations

Abstract

We propose a general maximum likelihood empirical Bayes (GMLEB) method for the estimation of a mean vector based on observations with i.i.d. normal errors. We prove that, under mild moment conditions on the unknown means, the average mean squared error (MSE) of the GMLEB is within an infinitesimal fraction of the minimum average MSE among all separable estimators which use a single deterministic estimating function on individual observations, provided that the risk is of greater order than (log n)^5/n. We also prove that the GMLEB is uniformly approximately minimax in regular and weak ℓ_p balls when the order of the length-normalized norm of the unknown means is between (log n)^{κ₁}/n^{1/(p∧2)} and n/(log n)^{κ₂}. Simulation experiments demonstrate that the GMLEB outperforms the James-Stein and several state-of-the-art threshold estimators in a wide range of settings without much downside.
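The abstract describes the GMLEB recipe at a high level: fit a prior for the unknown means by nonparametric maximum likelihood over the observations, then estimate each mean by the posterior mean under that fitted prior. The following is a minimal sketch of that idea, assuming unit noise variance and approximating the nonparametric MLE by an EM fit on a fixed grid of support points; the function name gmleb, the grid size, and the EM details are illustrative choices, not the authors' implementation.

```python
import numpy as np

def gmleb(x, n_grid=300, n_iter=500, tol=1e-8):
    """Sketch of a general maximum likelihood empirical Bayes (GMLEB) estimator.

    x : observations x_i = theta_i + eps_i with eps_i ~ N(0, 1).
    Returns posterior-mean estimates of theta under the fitted prior.
    The nonparametric MLE of the prior is approximated on a fixed grid and
    fitted with a plain EM loop (an assumption of this sketch).
    """
    x = np.asarray(x, dtype=float)
    # Grid of candidate support points for the (discretized) prior.
    grid = np.linspace(x.min(), x.max(), n_grid)
    # Normal likelihood matrix: L[i, j] = phi(x_i - u_j), unit noise variance.
    L = np.exp(-0.5 * (x[:, None] - grid[None, :]) ** 2) / np.sqrt(2 * np.pi)
    w = np.full(n_grid, 1.0 / n_grid)           # initial prior weights
    prev_ll = -np.inf
    for _ in range(n_iter):
        f = L @ w                               # marginal density at each x_i
        ll = np.sum(np.log(f))
        if ll - prev_ll < tol:                  # stop when likelihood stalls
            break
        prev_ll = ll
        # EM update: average posterior responsibilities over observations.
        w = (L / f[:, None]).mean(axis=0) * w
        w /= w.sum()
    # Plug-in posterior mean E[theta | x_i] under the fitted prior.
    f = L @ w
    return (L @ (w * grid)) / f

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Sparse means: most are 0, a few are drawn away from 0.
    theta = np.concatenate([np.zeros(900), rng.normal(4.0, 1.0, 100)])
    x = theta + rng.normal(size=theta.size)
    theta_hat = gmleb(x)
    print("GMLEB MSE:", np.mean((theta_hat - theta) ** 2))
    print("MLE   MSE:", np.mean((x - theta) ** 2))
```

In such sparse settings the fitted prior puts most of its mass near zero, so the posterior mean shrinks small observations strongly while leaving large ones nearly unshrunk, which is the behavior compared against James-Stein and threshold estimators in the paper's simulations.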

Original language: English (US)
Pages (from-to): 1647-1684
Number of pages: 38
Journal: Annals of Statistics
Volume: 37
Issue number: 4
State: Published - Aug 2009

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Keywords

  • Adaptive estimation
  • Compound estimation
  • Empirical Bayes
  • Shrinkage estimator
  • Threshold estimator
  • White noise
