TY - JOUR
T1 - Proximal methods for nonlinear programming
T2 - Double regularization and inexact subproblems
AU - Eckstein, Jonathan
AU - Silva, Paulo J.S.
N1 - Funding Information:
This research was supported in part by a Faculty Research Grant from Rutgers Business School—Newark and New Brunswick. This research was partially carried out while Paulo J.S. Silva was visiting RUTCOR and IMECC-UNICAMP. Supported by CNPq (grant 303030/2007-0), FAPESP (grant 2008/03823-0), and PRONEX-Optimization.
PY - 2010/6
Y1 - 2010/6
N2 - This paper describes the first phase of a project attempting to construct an efficient general-purpose nonlinear optimizer using an augmented Lagrangian outer loop with a relative error criterion, and an inner loop employing a state-of-the-art conjugate gradient solver. The outer loop can also employ double regularized proximal kernels, a fairly recent theoretical development that leads to fully smooth subproblems. We first enhance the existing theory to show that our approach is globally convergent in both the primal and dual spaces when applied to convex problems. We then present an extensive computational evaluation using the CUTE test set, showing that some aspects of our approach are promising, but some are not. These conclusions in turn lead to additional computational experiments suggesting where to next focus our theoretical and computational efforts.
AB - This paper describes the first phase of a project attempting to construct an efficient general-purpose nonlinear optimizer using an augmented Lagrangian outer loop with a relative error criterion, and an inner loop employing a state-of-the-art conjugate gradient solver. The outer loop can also employ double regularized proximal kernels, a fairly recent theoretical development that leads to fully smooth subproblems. We first enhance the existing theory to show that our approach is globally convergent in both the primal and dual spaces when applied to convex problems. We then present an extensive computational evaluation using the CUTE test set, showing that some aspects of our approach are promising, but some are not. These conclusions in turn lead to additional computational experiments suggesting where to next focus our theoretical and computational efforts.
KW - Augmented Lagrangians
KW - Nonlinear programming
KW - Proximal algorithms
UR - http://www.scopus.com/inward/record.url?scp=77954758389&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=77954758389&partnerID=8YFLogxK
U2 - 10.1007/s10589-009-9274-1
DO - 10.1007/s10589-009-9274-1
M3 - Article
AN - SCOPUS:77954758389
SN - 0926-6003
VL - 46
SP - 279
EP - 304
JO - Computational Optimization and Applications
JF - Computational Optimization and Applications
IS - 2
ER -