Stochastic gradient descent with differentially private updates

Shuang Song, Kamalika Chaudhuri, Anand D. Sarwate

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

498 Scopus citations

Abstract

Differential privacy is a recent framework for computation on sensitive data, which has shown considerable promise in the regime of large datasets. Stochastic gradient methods are a popular approach for learning in the data-rich regime because they are computationally tractable and scalable. In this paper, we derive differentially private versions of stochastic gradient descent, and test them empirically. Our results show that standard SGD experiences high variability due to differential privacy, but a moderate increase in the batch size can improve performance significantly.
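To illustrate the general recipe described in the abstract (not the paper's exact mechanism), the sketch below shows one differentially private mini-batch SGD step in Python. The assumptions here are per-example gradient clipping to norm `clip`, Gaussian noise scaled to the mini-batch sensitivity, and a user-supplied `grad_fn`; the paper itself uses a specific noise distribution and sensitivity analysis, and a full implementation would also track the privacy loss across all iterations.

import numpy as np

def dp_sgd_step(w, grad_fn, batch, lr, eps, clip=1.0, rng=None):
    """One differentially private mini-batch SGD step (illustrative sketch).

    Assumptions (not from the paper): per-example gradients are clipped to
    norm `clip`, and Gaussian noise calibrated to the batch-averaged
    sensitivity is added. `grad_fn(w, x)` returns the gradient for example x.
    """
    rng = rng or np.random.default_rng()
    b = len(batch)
    # Average clipped per-example gradients; with clipping, replacing one
    # example changes the mean gradient by at most 2 * clip / b in L2 norm.
    grads = []
    for x in batch:
        g = grad_fn(w, x)
        g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))
        grads.append(g)
    mean_grad = np.mean(grads, axis=0)
    sensitivity = 2.0 * clip / b
    # Noise scale proportional to sensitivity / eps; a real implementation
    # would use a properly calibrated mechanism and composition accounting.
    noise = rng.normal(0.0, sensitivity / eps, size=mean_grad.shape)
    return w - lr * (mean_grad + noise)

Because the noise scale shrinks as 1/b, a larger mini-batch reduces the per-update noise, which is consistent with the abstract's observation that a moderate increase in batch size improves performance.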

Original language: English (US)
Title of host publication: 2013 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2013 - Proceedings
Pages: 245-248
Number of pages: 4
DOIs
State: Published - 2013
Externally published: Yes
Event: 2013 1st IEEE Global Conference on Signal and Information Processing, GlobalSIP 2013 - Austin, TX, United States
Duration: Dec 3, 2013 - Dec 5, 2013

Publication series

Name: 2013 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2013 - Proceedings

Other

Other: 2013 1st IEEE Global Conference on Signal and Information Processing, GlobalSIP 2013
Country/Territory: United States
City: Austin, TX
Period: 12/3/13 - 12/5/13

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Signal Processing

