TY - JOUR
T1 - Algorithmic analysis and statistical estimation of SLOPE via approximate message passing
AU - Bu, Zhiqi
AU - Klusowski, Jason M.
AU - Rush, Cynthia
AU - Su, Weijie
N1 - Funding Information:
Zhiqi Bu: Department of Applied Mathematics and Computational Science, University of Pennsylvania, Philadelphia, PA 19104, USA. Email: [email protected]. Jason M. Klusowski: Department of Statistics, Rutgers University, New Brunswick, NJ 08854, USA. Email: [email protected]. Supported in part by NSF DMS #1915932. Cynthia Rush: Department of Statistics, Columbia University, New York, NY 10027, USA. Email: [email protected]. Supported in part by NSF CCF #1849883. Weijie Su: Department of Statistics, University of Pennsylvania, Philadelphia, PA 19104, USA. Email: [email protected]. Supported in part by NSF DMS CAREER #1847415 and NSF CCF #1763314.
Publisher Copyright:
© 2019 Neural Information Processing Systems Foundation. All rights reserved.
PY - 2019
Y1 - 2019
AB - SLOPE is a relatively new convex optimization procedure for high-dimensional linear regression via the sorted ℓ1 penalty: the larger the rank of the fitted coefficient, the larger the penalty. This non-separable penalty renders many existing techniques invalid or inconclusive for analyzing the SLOPE solution. In this paper, we develop an asymptotically exact characterization of the SLOPE solution under Gaussian random designs by solving the SLOPE problem using approximate message passing (AMP). This algorithmic approach allows us to approximate the SLOPE solution via the much more amenable AMP iterates. Explicitly, we characterize the asymptotic dynamics of the AMP iterates by relying on a recently developed state evolution analysis for non-separable penalties, thereby overcoming the difficulty caused by the sorted ℓ1 penalty. Moreover, we prove that the AMP iterates converge to the SLOPE solution in an asymptotic sense, and numerical simulations show that the convergence is surprisingly fast. Our proof rests on a novel technique that specifically leverages the structure of the SLOPE problem. In contrast to prior literature, our work not only yields an asymptotically sharp analysis but also offers an algorithmic, flexible, and constructive approach to understanding the SLOPE problem.
UR - http://www.scopus.com/inward/record.url?scp=85090169917&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85090169917&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85090169917
SN - 1049-5258
VL - 32
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
Y2 - 8 December 2019 through 14 December 2019
ER -