Paper Title

FedOBD: Opportunistic Block Dropout for Efficiently Training Large-scale Neural Networks through Federated Learning

Paper Authors

Yuanyuan Chen, Zichen Chen, Pengcheng Wu, Han Yu

Paper Abstract

Large-scale neural networks possess considerable expressive power. They are well-suited for complex learning tasks in industrial applications. However, large-scale models pose significant challenges for training under the current Federated Learning (FL) paradigm. Existing approaches for efficient FL training often leverage model parameter dropout. However, manipulating individual model parameters is not only inefficient in meaningfully reducing the communication overhead when training large-scale FL models, but may also be detrimental to the scaling efforts and model performance as shown by recent research. To address these issues, we propose the Federated Opportunistic Block Dropout (FedOBD) approach. The key novelty is that it decomposes large-scale models into semantic blocks so that FL participants can opportunistically upload quantized blocks, which are deemed to be significant towards training the model, to the FL server for aggregation. Extensive experiments evaluating FedOBD against four state-of-the-art approaches based on multiple real-world datasets show that it reduces the overall communication overhead by more than 88% compared to the best performing baseline approach, while achieving the highest test accuracy. To the best of our knowledge, FedOBD is the first approach to perform dropout on FL models at the block level rather than at the individual parameter level.
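
The abstract describes the core idea at a high level: decompose the model into semantic blocks, score each block's significance after local training, and upload only the quantized significant blocks to the server. The sketch below illustrates one possible client-side reading of that idea in PyTorch. It is a minimal sketch under stated assumptions, not the paper's actual algorithm: the block grouping (by top-level child module), the importance score (relative weight change since the last global model), the `keep_ratio` threshold, and the toy 8-bit quantization are all illustrative choices, and the helper names (`group_into_blocks`, `block_importance`, `select_and_quantize_blocks`) are hypothetical.

```python
import torch
import torch.nn as nn


def group_into_blocks(model: nn.Module):
    """Group parameters by top-level child module, treated here as a 'semantic block'."""
    return {
        block_name: {
            f"{block_name}.{p_name}": p.detach().clone()
            for p_name, p in block.named_parameters()
        }
        for block_name, block in model.named_children()
    }


def block_importance(local_block, global_block):
    """Hypothetical importance score: relative change of the block since the last global model."""
    diff, norm = 0.0, 0.0
    for name, local_p in local_block.items():
        diff += torch.sum((local_p - global_block[name]) ** 2).item()
        norm += torch.sum(global_block[name] ** 2).item()
    return diff / (norm + 1e-12)


def select_and_quantize_blocks(local_model, global_model, keep_ratio=0.5):
    """Keep only the most-changed blocks and quantize them before upload (toy scheme)."""
    local_blocks = group_into_blocks(local_model)
    global_blocks = group_into_blocks(global_model)
    scores = {
        name: block_importance(local_blocks[name], global_blocks[name])
        for name in local_blocks
    }
    n_keep = max(1, int(len(scores) * keep_ratio))
    kept = sorted(scores, key=scores.get, reverse=True)[:n_keep]

    upload = {}
    for name in kept:
        upload[name] = {}
        for p_name, p in local_blocks[name].items():
            # Toy per-tensor 8-bit quantization as a stand-in for the
            # quantization step mentioned in the abstract.
            scale = float(p.abs().max()) / 127 + 1e-12
            upload[name][p_name] = torch.quantize_per_tensor(
                p, scale=scale, zero_point=0, dtype=torch.qint8
            )
    return upload


if __name__ == "__main__":
    global_model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
    local_model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
    local_model.load_state_dict(global_model.state_dict())
    with torch.no_grad():  # simulate local training drift concentrated in one block
        local_model[0].weight.add_(0.1 * torch.randn_like(local_model[0].weight))
    payload = select_and_quantize_blocks(local_model, global_model, keep_ratio=0.5)
    print("blocks selected for upload:", list(payload.keys()))
```

In this toy run, only the block whose weights changed during simulated local training is selected and quantized, which conveys why dropping whole unimportant blocks (rather than scattered individual parameters) can cut upload volume substantially.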
