Distributed inexact Newton-type pursuit for non-convex sparse learning

Bo Liu, Xiao Tong Yuan, Lezi Wang, Qingshan Liu, Junzhou Huang, Dimitris N. Metaxas

Research output: Contribution to conference › Paper › peer-review


Abstract

In this paper, we present a sample-distributed greedy pursuit method for non-convex sparse learning under a cardinality constraint. Given training samples uniformly randomly partitioned across multiple machines, the proposed method alternates between local inexact sparse minimization of a Newton-type approximation and centralized aggregation of the local results. Theoretical analysis shows that for a general class of convex functions with Lipschitz continuous Hessian, the method converges linearly with a contraction factor scaling inversely with the local data size, while the communication complexity required to reach the desired statistical accuracy scales logarithmically with the number of machines for several popular statistical learning models. For non-convex objective functions, our method can be shown to converge, up to a local estimation error, to a local stationary sparse solution with sub-linear communication complexity. Numerical results demonstrate the efficiency and accuracy of our method when applied to large-scale sparse learning tasks, including deep neural network pruning.
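To make the alternation described above concrete, here is a minimal single-process sketch of the round structure, not the authors' implementation: it assumes a least-squares loss (for which the local Newton-type quadratic model around the current iterate coincides with the loss itself) and uses iterative hard thresholding as the inexact local sparse solver. The function names hard_threshold, local_newton_pursuit, and distributed_pursuit are illustrative, as is the averaging-plus-thresholding aggregation rule.

```python
import numpy as np

def hard_threshold(w, k):
    """Keep the k largest-magnitude entries of w, zero out the rest."""
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-k:]
    out[idx] = w[idx]
    return out

def local_newton_pursuit(X, y, w, k, inner_steps=20):
    """Inexact local step: approximately minimize the Newton-type quadratic
    model of the local least-squares loss around w, subject to ||v||_0 <= k,
    via iterative hard thresholding. For least squares the quadratic model
    equals the loss itself, so we run IHT directly on the local loss."""
    n = X.shape[0]
    # Step size from an upper bound on the local Hessian spectral norm.
    eta = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)
    v = w.copy()
    for _ in range(inner_steps):
        grad = X.T @ (X @ v - y) / n
        v = hard_threshold(v - eta * grad, k)
    return v

def distributed_pursuit(shards, k, rounds=10):
    """Alternate local inexact sparse Newton-type minimization with a
    centralized aggregation (average, then hard-threshold). Each round
    costs one broadcast/gather communication."""
    d = shards[0][0].shape[1]
    w = np.zeros(d)
    for _ in range(rounds):
        local = [local_newton_pursuit(X, y, w, k) for X, y in shards]
        w = hard_threshold(np.mean(local, axis=0), k)  # aggregation step
    return w

# Toy usage: a k-sparse linear model with data split across 4 "machines".
rng = np.random.default_rng(0)
d, k, n_per = 100, 5, 200
w_true = np.zeros(d)
w_true[:k] = rng.normal(size=k)
shards = []
for _ in range(4):
    X = rng.normal(size=(n_per, d))
    shards.append((X, X @ w_true + 0.01 * rng.normal(size=n_per)))
w_hat = distributed_pursuit(shards, k)
print("support recovered:", set(np.flatnonzero(w_hat)) == set(range(k)))
```

In this sketch each outer round corresponds to one communication between the machines and the coordinator, which is the quantity the paper's logarithmic and sub-linear communication-complexity bounds refer to.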

Original language: English (US)
State: Published - 2020
Event: 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019 - Naha, Japan
Duration: Apr 16, 2019 – Apr 18, 2019

Conference

Conference: 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019
Country/Territory: Japan
City: Naha
Period: 4/16/19 – 4/18/19

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Statistics and Probability

