Paper Title

Federated Stochastic Gradient Descent Begets Self-Induced Momentum

Authors

Yang, Howard H., Liu, Zuozhu, Fu, Yaru, Quek, Tony Q. S., Poor, H. Vincent

Abstract

Federated learning (FL) is an emerging machine learning method that can be applied in mobile edge systems, in which a server and a host of clients collaboratively train a statistical model using the data and computation resources of the clients without directly exposing their privacy-sensitive data. We show that running stochastic gradient descent (SGD) in such a setting can be viewed as adding a momentum-like term to the global aggregation process. Based on this finding, we further analyze the convergence rate of a federated learning system by accounting for the effects of parameter staleness and communication resources. These results advance the understanding of the Federated SGD algorithm, and also forge a link between staleness analysis and federated computing systems, which can be useful for system designers.
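To make the aggregation step concrete, below is a minimal sketch of one synchronous Federated SGD round on a toy scalar problem, alongside a classical heavy-ball (momentum) update for comparison. All names (`local_grad`, `fedsgd_round`, the quadratic client losses) are illustrative assumptions, not code from the paper; the paper's claim is that staleness in the federated setting induces a term of the momentum-like form shown, which this sketch does not derive.

```python
# Toy setup: client k holds loss f_k(w) = 0.5 * (w - c_k)**2,
# so its gradient at the global model w is simply (w - c_k).
# These quadratic losses are an illustrative assumption.

def local_grad(w, c_k):
    """Gradient of client k's loss f_k(w) = 0.5 * (w - c_k)**2."""
    return w - c_k

def fedsgd_round(w, client_data, lr=0.1):
    """One synchronous Federated SGD round: the server broadcasts w,
    each client computes one gradient, and the server aggregates by
    averaging before taking a gradient step."""
    grads = [local_grad(w, c_k) for c_k in client_data]
    avg_grad = sum(grads) / len(grads)
    return w - lr * avg_grad

def heavy_ball_step(w, w_prev, grad, lr=0.1, beta=0.5):
    """Classical heavy-ball update, shown only for comparison:
    w_next = w - lr * grad + beta * (w - w_prev)."""
    return w - lr * grad + beta * (w - w_prev)

w = 0.0
data = [1.0, 2.0, 3.0]  # client optima; the global optimum is their mean, 2.0
for _ in range(100):
    w = fedsgd_round(w, data)
print(w)  # approaches 2.0, the minimizer of the averaged loss
```

Running the loop drives `w` toward the average of the client optima, which is what plain Federated SGD should do on this toy objective; the paper's contribution is characterizing how stale parameters turn this update into one resembling `heavy_ball_step`.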
