Sample Complexity Bounds for Low-Separation-Rank Dictionary Learning

Mohsen Ghassemi, Zahra Shakeri, Waheed U. Bajwa, Anand D. Sarwate

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This work addresses the problem of structured dictionary learning for computing sparse representations of tensor-structured data. It introduces a low-separation-rank dictionary learning (LSR-DL) model that better captures the structure of tensor data by generalizing the separable dictionary learning model. A dictionary with p columns that is generated from the LSR-DL model is shown to be locally identifiable from noisy observations with recovery error at most ρ, provided the number of training samples scales as (# of degrees of freedom in the dictionary) × p²ρ⁻².
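To make the model concrete, the sketch below builds a toy LSR-DL dictionary for 2-way tensor data as a sum of s Kronecker products of smaller coordinate dictionaries (s = 1 recovers the separable model). The dimensions and variable names are illustrative, not from the paper; the parameter count contrasts the structured model's degrees of freedom with those of an unstructured dictionary of the same size.

```python
import numpy as np

# Illustrative sketch (sizes and names are hypothetical).
# A separation-rank-s dictionary for 2-way tensor data has the form
#   D = sum_{r=1}^{s} A_r ⊗ B_r,  A_r ∈ R^{m1×p1}, B_r ∈ R^{m2×p2},
# so D has m = m1*m2 rows and p = p1*p2 columns.
rng = np.random.default_rng(0)
m1, p1, m2, p2, s = 4, 6, 5, 7, 2  # small example sizes

A = [rng.standard_normal((m1, p1)) for _ in range(s)]
B = [rng.standard_normal((m2, p2)) for _ in range(s)]

# Full dictionary assembled from the Kronecker factors.
D = sum(np.kron(A[r], B[r]) for r in range(s))
print(D.shape)  # (20, 42)

# Degrees of freedom: structured (LSR) model vs. an arbitrary dictionary
# of the same ambient size — the gap drives the sample complexity savings.
dof_lsr = s * (m1 * p1 + m2 * p2)   # parameters in the Kronecker factors
dof_full = (m1 * m2) * (p1 * p2)    # parameters in an unstructured dictionary
print(dof_lsr, dof_full)  # 118 840
```

Even at these toy sizes the structured model has far fewer free parameters (118 vs. 840), which is the intuition behind the improved sample complexity bound in the abstract.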

Original language: English (US)
Title of host publication: 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2294-2298
Number of pages: 5
ISBN (Electronic): 9781538692912
DOIs
State: Published - Jul 2019
Event: 2019 IEEE International Symposium on Information Theory, ISIT 2019 - Paris, France
Duration: Jul 7, 2019 – Jul 12, 2019

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2019-July
ISSN (Print): 2157-8095

Conference

Conference: 2019 IEEE International Symposium on Information Theory, ISIT 2019
Country/Territory: France
City: Paris
Period: 7/7/19 – 7/12/19

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics
