Convergence of a stochastic subgradient method with averaging for nonsmooth nonconvex constrained optimization

Research output: Contribution to journal › Article › peer-review


Abstract

We prove convergence of a single time-scale stochastic subgradient method with subgradient averaging for constrained problems with a nonsmooth and nonconvex objective function having the property of generalized differentiability. As a tool in our analysis, we also prove a chain rule on a path for such functions.
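
The paper itself gives the precise recursion and the conditions under which it converges; as a rough illustration of the general idea, the sketch below shows a single step-size sequence tau_k driving both a running average z of noisy subgradients and a projected step along that average. This is a minimal sketch only: the test function f, the ball constraint, the step-size exponent, and the noise model are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def project_ball(x, radius=2.0):
    """Euclidean projection onto the ball {x : ||x||_2 <= radius} (assumed constraint set)."""
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else x * (radius / nrm)

def f(x):
    """Illustrative nonsmooth, nonconvex objective: f(x) = sum_i min(|x_i - 1|, |x_i + 1|)."""
    return np.sum(np.minimum(np.abs(x - 1.0), np.abs(x + 1.0)))

def subgrad(x):
    """One subgradient of f: per coordinate, pick the nearer kink (+1 or -1) and take the sign."""
    target = np.where(np.abs(x - 1.0) <= np.abs(x + 1.0), 1.0, -1.0)
    return np.sign(x - target)

rng = np.random.default_rng(0)
x = rng.standard_normal(10)
z = np.zeros_like(x)                    # running average of stochastic subgradients
for k in range(1, 5001):
    tau = 1.0 / k ** 0.75               # one step-size sequence drives both updates (assumed schedule)
    g = subgrad(x) + 0.1 * rng.standard_normal(x.shape)   # noisy subgradient oracle (assumed noise)
    z += tau * (g - z)                  # subgradient averaging
    x = project_ball(x - tau * z)       # projected step along the averaged direction
print(f"final objective: {f(x):.4f}")
```

Here "single time scale" refers to driving both the averaging update for z and the projected step for x with the same sequence tau_k, in contrast with two-time-scale schemes that average on a faster clock.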

Original language: English (US)
Pages (from-to): 1615-1625
Number of pages: 11
Journal: Optimization Letters
Volume: 14
Issue number: 7
DOIs
State: Published - Oct 1 2020

All Science Journal Classification (ASJC) codes

  • Control and Optimization

Keywords

  • Chain rule
  • Generalized differentiable functions
  • Nonsmooth optimization
  • Stochastic subgradient method
