Paper Title

FedADC: Accelerated Federated Learning with Drift Control

Paper Authors

Kerem Ozfatura, Emre Ozfatura, Deniz Gunduz

Paper Abstract

Federated learning (FL) has become the de facto framework for collaborative learning among edge devices with privacy concerns. At the core of the FL strategy is the use of stochastic gradient descent (SGD) in a distributed manner. Large-scale implementation of FL brings new challenges, such as incorporating acceleration techniques designed for SGD into the distributed setting, and mitigating the drift problem caused by the non-homogeneous distribution of local datasets. These two problems have been studied separately in the literature; in this paper, however, we show that both can be addressed with a single strategy, without any major alteration to the FL framework or any additional computation and communication load. To achieve this goal, we propose FedADC, an accelerated FL algorithm with drift control. We empirically illustrate the advantages of FedADC.
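
To make the abstract's central idea concrete, below is a minimal sketch (not the authors' FedADC implementation) of a federated local-SGD loop in which a single server-side momentum buffer serves double duty: it accelerates the SGD updates and acts as a shared global direction that counteracts client drift under non-IID data. The toy quadratic objectives, the hyperparameters, and all function names are illustrative assumptions, not taken from the paper.

```python
# Sketch only: one possible way to combine SGD acceleration with drift
# control in FL, under the assumptions stated in the lead-in above.
import numpy as np

rng = np.random.default_rng(0)

# Non-IID toy problem: each client minimizes 0.5*||w - opt_i||^2 with a
# different optimum opt_i, so unconstrained local SGD drifts apart.
client_optima = [rng.normal(size=5) for _ in range(4)]

def local_update(w_global, opt, momentum, steps=5, lr=0.1, beta_local=0.5):
    """Local SGD that mixes the broadcast momentum into every step,
    pulling the client toward the global direction (drift control)."""
    w = w_global.copy()
    for _ in range(steps):
        grad = (w - opt) + 0.1 * rng.normal(size=w.shape)  # noisy gradient
        w -= lr * (grad + beta_local * momentum)
    return w_global - w  # pseudo-gradient (model delta) sent to the server

w = np.zeros(5)
momentum = np.zeros(5)
for _ in range(60):
    deltas = [local_update(w, opt, momentum) for opt in client_optima]
    avg_delta = np.mean(deltas, axis=0)
    momentum = 0.6 * momentum + avg_delta  # server-side momentum (acceleration)
    w -= momentum                          # apply the accelerated update

print("global model:  ", np.round(w, 3))
print("mean of optima:", np.round(np.mean(client_optima, axis=0), 3))
```

Running this, the global model hovers near the mean of the client optima: the momentum term broadcast to clients keeps their local steps aligned with the global direction, while the same buffer accelerates the server update, which is the double use the abstract alludes to.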
