Sparsity and the truncated ℓ2-norm

Research output: Contribution to journal › Conference article

1 Citation (Scopus)

Abstract

Sparsity is a fundamental topic in high-dimensional data analysis. Perhaps the most common measures of sparsity are the ℓp-norms, for 0 ≤ p < 2. In this paper, we study an alternative measure of sparsity, the truncated ℓ2-norm, which is related to other ℓp-norms but appears to have some unique and useful properties. Focusing on the n-dimensional Gaussian location model, we derive exact asymptotic minimax results for estimation over truncated ℓ2-balls, which complement existing results for ℓp-balls. We then propose simple new adaptive thresholding estimators that are inspired by the truncated ℓ2-norm and are adaptive asymptotic minimax over ℓp-balls (0 ≤ p < 2), as well as truncated ℓ2-balls. Finally, we derive lower bounds on the Bayes risk of an estimator, in terms of the parameter's truncated ℓ2-norm. These bounds provide necessary conditions for Bayes risk consistency in certain problems that are relevant for high-dimensional Bayesian modeling.
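For orientation, the setting referenced in the abstract can be sketched as follows. The Gaussian location (sequence) model and the ℓp sparsity measures below are standard; the final display shows one common coordinate-wise way to truncate the ℓ2-norm at a level t, which is an illustrative assumption here and not necessarily the exact definition used in the paper.

    % Gaussian location (sequence) model: n noisy coordinates of a mean vector theta.
    \[
      y_i = \theta_i + \varepsilon_i, \qquad
      \varepsilon_i \sim N(0, \sigma^2) \ \text{i.i.d.}, \qquad i = 1, \dots, n.
    \]
    % lp sparsity measures for 0 <= p < 2 (p = 0 counts the nonzero coordinates).
    \[
      \|\theta\|_0 = \#\{ i : \theta_i \neq 0 \}, \qquad
      \|\theta\|_p^p = \sum_{i=1}^{n} |\theta_i|^p \quad (0 < p < 2).
    \]
    % Assumed illustrative form of a truncated l2-norm: each coordinate's squared
    % contribution is capped at t^2, so a few large entries cannot dominate the measure.
    \[
      \|\theta\|_{2,t}^2 = \sum_{i=1}^{n} \min\{ \theta_i^2, \, t^2 \}, \qquad t > 0.
    \]

The adaptive thresholding estimators proposed in the paper are not specified in this record. As a hedged point of reference only, the minimal Python sketch below implements the standard soft-thresholding baseline with the universal threshold σ√(2 log n) in this model; the helper name soft_threshold is hypothetical, and this is not the paper's proposal.

    import numpy as np

    def soft_threshold(y, lam):
        # Coordinate-wise soft thresholding: sign(y) * max(|y| - lam, 0).
        return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

    # Toy instance of the Gaussian location model with a sparse mean vector.
    rng = np.random.default_rng(0)
    n, sigma = 1000, 1.0
    theta = np.zeros(n)
    theta[:20] = 5.0                              # 20 nonzero coordinates
    y = theta + sigma * rng.standard_normal(n)

    # Universal threshold sigma * sqrt(2 log n): a standard (non-adaptive) baseline.
    theta_hat = soft_threshold(y, sigma * np.sqrt(2.0 * np.log(n)))
    print("average squared error:", np.mean((theta_hat - theta) ** 2))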

Original language: English (US)
Pages (from-to): 159-166
Number of pages: 8
Journal: Journal of Machine Learning Research
Volume: 33
State: Published - Jan 1 2014
Event: 17th International Conference on Artificial Intelligence and Statistics, AISTATS 2014 - Reykjavik, Iceland
Duration: Apr 22 2014 - Apr 25 2014

Fingerprint

  • Sparsity
  • Norm
  • Ball
  • Bayes Risk
  • Minimax
  • Adaptive Thresholding
  • Estimator
  • Bayesian Modeling
  • Location Model
  • Gaussian Model
  • High-dimensional Data
  • n-dimensional
  • Data analysis
  • High-dimensional
  • Complement
  • Lower bound
  • Necessary Conditions
  • Alternatives

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

Cite this

@article{42aeece9f84f4dcbbd4cb8a030580e10,
title = "Sparsity and the truncated ℓ2-norm",
abstract = "Sparsity is a fundamental topic in high-dimensional data analysis. Perhaps the most common measures of sparsity are the ℓp-norms, for 0 ≤ p < 2. In this paper, we study an alternative measure of sparsity the truncated ℓ2-norm, which is related to other ℓp-norms, but appears to have some unique and useful properties. Focusing on the n-dimensional Gaussian location model, we derive exact asymptotic minimax results for estimation over truncated ℓ2-balls, which complement existing results for ℓp-balls. We then propose simple new adaptive thresholding estimators that are inspired by the truncated ℓ2-norm and are adaptive asymptotic minimax over ℓp-balls (0 ≤ p < 2), as well as truncated ℓ2-balls. Finally, we derive lower bounds on the Bayes risk of an estimator, in terms of the parameter's truncated ℓ2-norm. These bounds provide necessary conditions for Bayes risk consistency in certain problems that are relevant for high-dimensional Bayesian modeling.",
author = "Lee Dicker",
year = "2014",
month = "1",
day = "1",
language = "English (US)",
volume = "33",
pages = "159--166",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",

}

Sparsity and the truncated ℓ2-norm. / Dicker, Lee.

In: Journal of Machine Learning Research, Vol. 33, 01.01.2014, p. 159-166.

Research output: Contribution to journal › Conference article

TY - JOUR

T1 - Sparsity and the truncated ℓ2-norm

AU - Dicker, Lee

PY - 2014/1/1

Y1 - 2014/1/1

N2 - Sparsity is a fundamental topic in high-dimensional data analysis. Perhaps the most common measures of sparsity are the ℓp-norms, for 0 ≤ p < 2. In this paper, we study an alternative measure of sparsity, the truncated ℓ2-norm, which is related to other ℓp-norms but appears to have some unique and useful properties. Focusing on the n-dimensional Gaussian location model, we derive exact asymptotic minimax results for estimation over truncated ℓ2-balls, which complement existing results for ℓp-balls. We then propose simple new adaptive thresholding estimators that are inspired by the truncated ℓ2-norm and are adaptive asymptotic minimax over ℓp-balls (0 ≤ p < 2), as well as truncated ℓ2-balls. Finally, we derive lower bounds on the Bayes risk of an estimator, in terms of the parameter's truncated ℓ2-norm. These bounds provide necessary conditions for Bayes risk consistency in certain problems that are relevant for high-dimensional Bayesian modeling.

UR - http://www.scopus.com/inward/record.url?scp=84955479232&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84955479232&partnerID=8YFLogxK

M3 - Conference article

VL - 33

SP - 159

EP - 166

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

ER -