TY - CONF
T1 - Stochastic gradient descent with differentially private updates
AU - Song, Shuang
AU - Chaudhuri, Kamalika
AU - Sarwate, Anand D.
PY - 2013
AB - Differential privacy is a recent framework for computation on sensitive data, which has shown considerable promise in the regime of large datasets. Stochastic gradient methods are a popular approach for learning in the data-rich regime because they are computationally tractable and scalable. In this paper, we derive differentially private versions of stochastic gradient descent, and test them empirically. Our results show that standard SGD experiences high variability due to differential privacy, but a moderate increase in the batch size can improve performance significantly.
UR - http://www.scopus.com/inward/record.url?scp=84897680504&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84897680504&partnerID=8YFLogxK
DO - 10.1109/GlobalSIP.2013.6736861
M3 - Conference contribution
AN - SCOPUS:84897680504
SN - 9781479902484
T3 - 2013 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2013 - Proceedings
SP - 245
EP - 248
BT - 2013 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2013 - Proceedings
T2 - 2013 1st IEEE Global Conference on Signal and Information Processing, GlobalSIP 2013
Y2 - 3 December 2013 through 5 December 2013
ER -