Clip21-SGD2M: new method with strong optimization and DP guarantees

Our new paper on distributed optimization with strong optimization and differential privacy (DP) guarantees is out! We introduce Clip21-SGD2M, a method featuring a double momentum mechanism: one momentum term manages stochastic noise, while the other averages out DP noise. We establish optimal convergence guarantees in both the deterministic and stochastic settings, along with a near-optimal privacy-utility tradeoff in the DP framework. Finally, the method performs competitively in practice, efficiently handling noise when training neural networks.
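To make the double-momentum idea concrete, here is a minimal single-worker sketch in Python. It is only an illustration of the general pattern described above (a first momentum that smooths stochastic gradients, followed by clipping, Gaussian DP noise, and a second momentum that averages the privatized signal); the parameter names (`beta1`, `beta2`, `tau`, `sigma`), the update order, and the exact placement of clipping and noise are assumptions on my part, not the algorithm from the paper.

```python
import numpy as np

def clip(v, tau):
    # Standard clipping operator: scale v so its norm is at most tau.
    norm = np.linalg.norm(v)
    return v if norm <= tau else v * (tau / norm)

def double_momentum_sketch(grad_fn, x0, steps=100, lr=0.1,
                           beta1=0.9, beta2=0.9, tau=1.0, sigma=0.1,
                           rng=None):
    """Hypothetical single-worker loop illustrating the two momenta.

    m1: momentum on stochastic gradients (damps stochastic noise)
    m2: momentum on the clipped, privatized signal (averages DP noise)
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    m1 = np.zeros_like(x)
    m2 = np.zeros_like(x)
    for _ in range(steps):
        g = grad_fn(x, rng)                  # stochastic gradient oracle
        m1 = beta1 * m1 + (1 - beta1) * g    # first momentum: smooth stochastic noise
        # Clip, then add Gaussian noise for DP (placement is an assumption).
        private = clip(m1, tau) + sigma * rng.standard_normal(x.shape)
        m2 = beta2 * m2 + (1 - beta2) * private  # second momentum: average DP noise
        x -= lr * m2
    return x

# Toy usage on a noisy quadratic f(x) = 0.5 * ||x||^2, so grad f(x) = x.
if __name__ == "__main__":
    grad = lambda x, rng: x + 0.01 * rng.standard_normal(x.shape)
    x_final = double_momentum_sketch(grad, x0=np.ones(5), steps=500)
    print(np.linalg.norm(x_final))  # should be close to 0
```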
