Abstract
K-support-norm regularized minimization has recently been applied with success to sparse prediction problems. The proximal gradient method is conventionally used to minimize this composite objective. However, it tends to suffer from a high per-iteration cost, since the proximity operator associated with the k-support norm requires exhaustive search operations and can thus be time consuming in large-scale settings. In this paper, we reformulate the k-support-norm regularized problem as an equivalent constrained problem and propose a fully corrective Frank-Wolfe algorithm to minimize the constrained model. Our method is inspired by the observation that the convex-hull structure of the k-support-norm ball admits Frank-Wolfe-type algorithms with low iteration complexity. The convergence behavior of the proposed algorithm is analyzed. Extensive numerical results on learning tasks, including logistic regression and matrix pursuit, demonstrate the substantially improved computational efficiency of our algorithm over state-of-the-art proximal gradient algorithms.
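To illustrate why the constrained reformulation is cheap per iteration: the k-support-norm ball of radius tau is the convex hull of vectors with at most k nonzeros and Euclidean norm at most tau, so the linear minimization oracle needed by Frank-Wolfe reduces to selecting the k largest-magnitude gradient entries. The sketch below shows a plain (non-corrective) Frank-Wolfe loop built on this oracle; it is not the authors' fully corrective variant, and the names `ksupport_lmo`, `frank_wolfe`, and parameters `tau` and `iters` are illustrative assumptions.

```python
import numpy as np

def ksupport_lmo(grad, k, tau):
    """Linear minimization oracle over the k-support-norm ball of radius tau.

    Assumption for this sketch: the ball is the convex hull of vectors with
    at most k nonzeros and Euclidean norm <= tau, so the minimizer of
    <grad, s> is supported on the k largest-magnitude entries of grad.
    """
    idx = np.argpartition(np.abs(grad), -k)[-k:]  # indices of top-k entries
    s = np.zeros_like(grad, dtype=float)
    s[idx] = -grad[idx]
    norm = np.linalg.norm(s)
    if norm > 0:
        s *= tau / norm  # scale the atom to the boundary of the ball
    return s

def frank_wolfe(grad_f, x0, k, tau, iters=100):
    """Plain Frank-Wolfe with the standard step size 2 / (t + 2)."""
    x = x0.astype(float).copy()
    for t in range(iters):
        s = ksupport_lmo(grad_f(x), k, tau)
        gamma = 2.0 / (t + 2.0)
        x = (1 - gamma) * x + gamma * s  # convex combination update
    return x

# Toy usage: sparse least-squares, grad f(x) = A.T @ (A @ x - b)
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 200)), rng.standard_normal(50)
x_hat = frank_wolfe(lambda x: A.T @ (A @ x - b), np.zeros(200), k=10, tau=5.0)
```

Note that each iteration costs only a gradient evaluation plus a top-k selection, in contrast to the proximity operator of the k-support norm, whose evaluation involves a search over candidate supports.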
Original language | English (US) |
---|---|
Pages (from-to) | 1760-1766 |
Number of pages | 7 |
Journal | IJCAI International Joint Conference on Artificial Intelligence |
Volume | 2016-January |
State | Published - 2016 |
Event | 25th International Joint Conference on Artificial Intelligence, IJCAI 2016, New York, United States; duration: Jul 9 2016 → Jul 15 2016 |
All Science Journal Classification (ASJC) codes
- Artificial Intelligence