Learning structured low-rank representation via matrix factorization

Jie Shen, Ping Li

Research output: Contribution to conference › Paper › peer-review

14 Scopus citations


A vast body of recent work has shown that exploiting structures beyond low-rankness of the data can boost the performance of subspace clustering methods such as Low-Rank Representation (LRR). It is also well recognized that the matrix factorization framework offers more flexibility for pursuing the underlying structure of the data. In this paper, we propose to learn a structured LRR by factorizing the nuclear-norm-regularized matrix, which leads to our non-convex formulation NLRR. Interestingly, NLRR provides a general framework that unifies a variety of popular algorithms, including LRR, dictionary learning, robust principal component analysis, and sparse subspace clustering. We also propose several variants of NLRR, for example, to promote sparsity while preserving low-rankness. We design a practical algorithm for NLRR and its variants, and establish theoretical guarantees for the stability of the solution and the convergence of the algorithm. Perhaps surprisingly, the computational and memory costs of NLRR are roughly one order of magnitude lower than those of LRR. Experiments on extensive simulations and real datasets confirm the robustness and efficiency of NLRR and its variants.
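The factorization idea rests on a standard variational identity: the nuclear norm of a matrix X equals the minimum of (||U||_F² + ||V||_F²)/2 over all factorizations X = UV, attained at the balanced SVD factors. The following NumPy sketch (an illustration of that identity, not the paper's NLRR algorithm) verifies it numerically:

```python
import numpy as np

# Variational characterization used by factorization-based methods:
#   ||X||_* = min_{X = U V} 0.5 * (||U||_F^2 + ||V||_F^2),
# with the minimum attained at U = Us * sqrt(S), V = sqrt(S) * Vt.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 8))  # rank-3 matrix

# Nuclear norm = sum of singular values.
nuc = np.linalg.norm(X, ord="nuc")

# Balanced factors from the SVD of X.
Us, S, Vt = np.linalg.svd(X, full_matrices=False)
U = Us * np.sqrt(S)
V = np.sqrt(S)[:, None] * Vt

factored = 0.5 * (np.linalg.norm(U, "fro") ** 2 + np.linalg.norm(V, "fro") ** 2)
assert np.allclose(U @ V, X)       # the factors reconstruct X
assert np.isclose(nuc, factored)   # the factored objective matches ||X||_*
```

Replacing the nuclear-norm term by this factored objective is what yields a non-convex but much cheaper surrogate: the factors U and V can be kept low-dimensional, avoiding full SVDs of the data-sized matrix.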

Original language: English (US)
Number of pages: 10
State: Published - Jan 1 2016
Event: 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016 - Cadiz, Spain
Duration: May 9, 2016 – May 11, 2016


Conference: 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Statistics and Probability


