Paper Title
HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients
Paper Authors
Paper Abstract
Federated Learning (FL) is a method of training machine learning models on private data distributed over a large number of possibly heterogeneous clients such as mobile phones and IoT devices. In this work, we propose a new federated learning framework named HeteroFL to address heterogeneous clients equipped with very different computation and communication capabilities. Our solution enables the training of heterogeneous local models with varying computation complexities while still producing a single global inference model. For the first time, our method challenges the underlying assumption of existing work that local models must share the same architecture as the global model. We demonstrate several strategies to enhance FL training and conduct extensive empirical evaluations, covering five computation complexity levels of three model architectures on three datasets. We show that adaptively distributing subnetworks according to clients' capabilities is both computation and communication efficient.
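The core idea described in the abstract can be sketched as follows: each client receives a width-scaled subnetwork of the global model, and the server averages each global parameter over the clients whose subnetwork covers it. This is a minimal illustrative sketch with NumPy, not the paper's exact scheme; the function names (`extract_subnetwork`, `aggregate`) and the leading-channel slicing convention are assumptions for demonstration.

```python
import numpy as np

def extract_subnetwork(global_weights, rate):
    """Slice the leading rows/columns of each parameter to form a
    smaller local model (illustrative width scaling, not the
    paper's exact channel-selection scheme)."""
    local = {}
    for name, w in global_weights.items():
        if w.ndim > 1:
            out_dim = max(1, int(w.shape[0] * rate))
            in_dim = max(1, int(w.shape[1] * rate))
            local[name] = w[:out_dim, :in_dim].copy()
        else:  # e.g. a bias vector: scale the output dimension only
            out_dim = max(1, int(w.shape[0] * rate))
            local[name] = w[:out_dim].copy()
    return local

def aggregate(global_weights, local_updates):
    """Average each global parameter element-wise over the clients
    whose subnetwork covers it; uncovered entries keep their old
    values."""
    new_global = {}
    for name, w in global_weights.items():
        acc = np.zeros_like(w)
        cnt = np.zeros_like(w)
        for upd in local_updates:
            u = upd[name]
            sl = tuple(slice(0, s) for s in u.shape)
            acc[sl] += u
            cnt[sl] += 1
        # where no client covered a parameter, retain the old value
        new_global[name] = np.where(cnt > 0,
                                    acc / np.maximum(cnt, 1), w)
    return new_global
```

For example, with a 4x4 global weight matrix, a client at rate 0.5 trains a 2x2 slice; during aggregation the top-left 2x2 block is averaged over all clients that hold it, while the remaining entries are averaged only over the full-width clients.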