Learning with structured sparsity

Junzhou Huang, Tong Zhang, Dimitris Metaxas

Research output: Contribution to journal › Article › peer-review

221 Scopus citations


This paper investigates a learning formulation called structured sparsity, which is a natural extension of the standard sparsity concept in statistical learning and compressive sensing. By allowing arbitrary structures on the feature set, this concept generalizes the group sparsity idea that has become popular in recent years. A general theory is developed for learning with structured sparsity, based on the notion of coding complexity associated with the structure. It is shown that if the coding complexity of the target signal is small, then one can achieve improved performance by using coding complexity regularization methods, which generalize the standard sparse regularization. Moreover, a structured greedy algorithm is proposed to efficiently solve the structured sparsity problem. It is shown that the greedy algorithm approximately solves the coding complexity optimization problem under appropriate conditions. Experiments are included to demonstrate the advantage of structured sparsity over standard sparsity on some real applications.
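To make the abstract's idea concrete, below is a minimal illustrative sketch of greedy group selection with a coding-cost penalty. This is a hypothetical toy (a simplified cousin of the paper's structured greedy approach, not its actual algorithm): feature groups are added one at a time, ranked by residual reduction per unit of an assumed per-group "coding cost".

```python
import numpy as np

def greedy_group_select(X, y, groups, costs, max_groups=2):
    """Greedily pick feature groups by residual reduction per unit coding cost.

    X: (n, d) design matrix; y: (n,) targets.
    groups: list of index tuples partitioning the features.
    costs: assumed per-group coding costs (hypothetical stand-in for the
           paper's coding complexity of a structure).
    """
    selected, support = [], []
    residual = y.copy()
    for _ in range(max_groups):
        best = None
        for gi, g in enumerate(groups):
            if gi in selected:
                continue
            cols = support + list(g)
            # Least-squares refit on the enlarged candidate support.
            w, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            gain = np.linalg.norm(residual) ** 2 - np.linalg.norm(y - X[:, cols] @ w) ** 2
            score = gain / costs[gi]  # gain per unit coding cost
            if best is None or score > best[0]:
                best = (score, gi, cols, w)
        _, gi, cols, w = best
        selected.append(gi)
        support = cols
        residual = y - X[:, support] @ w
    return selected, support
```

Structures beyond disjoint groups (trees, graphs) would enter through the cost term, which is where the coding-complexity regularization generalizes standard sparse penalties.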

Original language: English (US)
Pages (from-to): 3371-3412
Number of pages: 42
Journal: Journal of Machine Learning Research
State: Published - Nov 1 2011

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence


Keywords

  • Compressive sensing
  • Feature selection
  • Graph sparsity
  • Group sparsity
  • Sparse learning
  • Standard sparsity
  • Structured sparsity
  • Tree sparsity

