TY - JOUR
T1 - Convergence of a stochastic subgradient method with averaging for nonsmooth nonconvex constrained optimization
AU - Ruszczyński, Andrzej
N1 - Publisher Copyright:
© 2020, Springer-Verlag GmbH Germany, part of Springer Nature.
PY - 2020/10/1
Y1 - 2020/10/1
N2 - We prove convergence of a single time-scale stochastic subgradient method with subgradient averaging for constrained problems with a nonsmooth and nonconvex objective function having the property of generalized differentiability. As a tool of our analysis, we also prove a chain rule on a path for such functions.
KW - Chain rule
KW - Generalized differentiable functions
KW - Nonsmooth optimization
KW - Stochastic subgradient method
UR - http://www.scopus.com/inward/record.url?scp=85078328498&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85078328498&partnerID=8YFLogxK
U2 - 10.1007/s11590-020-01537-8
M3 - Article
AN - SCOPUS:85078328498
SN - 1862-4472
VL - 14
SP - 1615
EP - 1625
JO - Optimization Letters
JF - Optimization Letters
IS - 7
ER -