Paper Title


SoteriaFL: A Unified Framework for Private Federated Learning with Communication Compression

Authors

Zhize Li, Haoyu Zhao, Boyue Li, Yuejie Chi

Abstract


To enable large-scale machine learning in bandwidth-hungry environments such as wireless networks, significant progress has been made recently in designing communication-efficient federated learning algorithms with the aid of communication compression. On the other hand, privacy preservation, especially at the client level, is another important desideratum that has not yet been addressed simultaneously in the presence of advanced communication compression techniques. In this paper, we propose a unified framework that enhances the communication efficiency of private federated learning with communication compression. Exploiting both general compression operators and local differential privacy, we first examine a simple algorithm that applies compression directly to differentially-private stochastic gradient descent, and identify its limitations. We then propose a unified framework, SoteriaFL, for private federated learning, which accommodates a general family of local gradient estimators including popular stochastic variance-reduced gradient methods and the state-of-the-art shifted compression scheme. We provide a comprehensive characterization of its performance trade-offs in terms of privacy, utility, and communication complexity, where SoteriaFL is shown to achieve better communication complexity than other private federated learning algorithms without communication compression, while sacrificing neither privacy nor utility.
