TY - JOUR
T1 - KoPA
T2 - Automated Kronecker Product Approximation
AU - Cai, Chencheng
AU - Chen, Rong
AU - Xiao, Han
N1 - Funding Information:
Chen’s research is supported in part by National Science Foundation grants DMS-1737857, IIS-1741390, CCF-1934924, DMS-2027855 and DMS-2052949. Xiao’s research is supported in part by National Science Foundation grant DMS-1454817, DMS-2027855, DMS-2052949 and a research grant from NEC Labs America. The authors thank an Associate Editor and two referees for their insightful comments that have led to significant improvement of the manuscript.
Publisher Copyright:
©2022 Chencheng Cai, Rong Chen and Han Xiao.
PY - 2022/8/1
Y1 - 2022/8/1
N2 - We consider the problem of matrix approximation and denoising induced by the Kronecker product decomposition. Specifically, we propose to approximate a given matrix by the sum of a few Kronecker products of matrices, which we refer to as the Kronecker product approximation (KoPA). Because the Kronecker product is an extension of the outer product from vectors to matrices, KoPA extends the low rank matrix approximation, and includes it as a special case. Compared with the latter, KoPA also offers greater flexibility, since it allows the user to choose the configuration, namely the dimensions of the two smaller matrices forming the Kronecker product. On the other hand, the configuration to be used is usually unknown, and needs to be determined from the data in order to achieve the optimal balance between accuracy and parsimony. We propose to use extended information criteria to select the configuration. Under the paradigm of high dimensional analysis, we show that the proposed procedure is able to select the true configuration with probability tending to one, under suitable conditions on the signal-to-noise ratio. We demonstrate the superiority of KoPA over low rank approximations through numerical studies and several benchmark image examples.
AB - We consider the problem of matrix approximation and denoising induced by the Kronecker product decomposition. Specifically, we propose to approximate a given matrix by the sum of a few Kronecker products of matrices, which we refer to as the Kronecker product approximation (KoPA). Because the Kronecker product is an extension of the outer product from vectors to matrices, KoPA extends the low rank matrix approximation, and includes it as a special case. Compared with the latter, KoPA also offers greater flexibility, since it allows the user to choose the configuration, namely the dimensions of the two smaller matrices forming the Kronecker product. On the other hand, the configuration to be used is usually unknown, and needs to be determined from the data in order to achieve the optimal balance between accuracy and parsimony. We propose to use extended information criteria to select the configuration. Under the paradigm of high dimensional analysis, we show that the proposed procedure is able to select the true configuration with probability tending to one, under suitable conditions on the signal-to-noise ratio. We demonstrate the superiority of KoPA over low rank approximations through numerical studies and several benchmark image examples.
KW - Information Criterion
KW - Kronecker Product
KW - Low Rank Approximation
KW - Matrix Decomposition
KW - Random Matrix
UR - http://www.scopus.com/inward/record.url?scp=85148008774&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85148008774&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:85148008774
SN - 1532-4435
VL - 23
JO - Journal of Machine Learning Research
JF - Journal of Machine Learning Research
M1 - 236
ER -