Selective linearization for multi-block statistical learning

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the problem of minimizing a sum of several convex non-smooth functions and discuss the selective linearization method (SLIN), which iteratively linearizes all but one of the functions and employs simple proximal steps. The algorithm is a form of multiple operator splitting in which the order of processing the partial functions is not fixed, but rather determined in the course of the calculations. SLIN is globally convergent for an arbitrary number of component functions without artificial duplication of variables. We report results from extensive numerical experiments in two statistical learning settings: large-scale overlapping group Lasso and the doubly regularized support vector machine. In each setting, we introduce novel and efficient methods for solving the sub-problems. The numerical results demonstrate the efficacy and accuracy of SLIN.
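To make the selective-linearization idea described in the abstract concrete, below is a minimal Python sketch of one way such an iteration could look: every block but one is replaced by a subgradient linearization at the current point, and the one remaining block is handled exactly through its proximal operator. The block dictionary layout, the fixed proximal parameter `rho`, and the greedy selection/acceptance test are simplifications assumed purely for illustration; they are not the selection rule or convergence tests of the SLIN paper itself.

```python
import numpy as np

def slin_sketch(blocks, x0, rho=1.0, max_iter=200, tol=1e-8):
    """Minimize sum_i f_i(x) over convex, possibly non-smooth blocks f_i.

    Each block is a dict of callables (names assumed for this sketch):
      'val'  : x -> f_i(x)
      'subg' : x -> one subgradient of f_i at x
      'prox' : (v, t) -> argmin_x f_i(x) + (1/(2*t)) * ||x - v||^2
    """
    x = np.asarray(x0, dtype=float)
    n = len(blocks)
    for _ in range(max_iter):
        subgrads = [b['subg'](x) for b in blocks]
        best_x = x
        best_obj = sum(b['val'](x) for b in blocks)
        improved = False
        for j in range(n):
            # Linearize every block except j at the current point and solve
            #   min_x f_j(x) + <sum_{i != j} g_i, x> + (rho/2) * ||x - x_k||^2,
            # whose minimizer is a proximal step of f_j at a shifted point.
            g_rest = sum(subgrads[i] for i in range(n) if i != j)
            candidate = blocks[j]['prox'](x - g_rest / rho, 1.0 / rho)
            obj = sum(b['val'](candidate) for b in blocks)
            if obj < best_obj - tol:
                best_x, best_obj, improved = candidate, obj, True
        if not improved:          # no block gave sufficient decrease: stop
            break
        x = best_x                # keep the step from the most helpful block
    return x
```

As a toy two-block instance, one could supply a quadratic loss block (whose proximal map has a closed form) together with an ℓ1-penalty block (soft-thresholding), mirroring a Lasso-type decomposition; the efficient sub-problem solvers for overlapping group Lasso and the doubly regularized SVM developed in the paper are more involved and are not reproduced here.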

Original language: English (US)
Pages (from-to): 219-228
Number of pages: 10
Journal: European Journal of Operational Research
Volume: 293
Issue number: 1
DOIs
State: Published - Aug 16 2021

All Science Journal Classification (ASJC) codes

  • Computer Science (all)
  • Modeling and Simulation
  • Management Science and Operations Research
  • Information Systems and Management

Keywords

  • Nonlinear programming
  • Penalized regression
  • Regularized support vector machine
  • Statistical learning
