Paper Title

Hierarchical Over-the-Air FedGradNorm

Paper Authors

Cemil Vahapoglu, Matin Mortaheb, Sennur Ulukus

Paper Abstract

Multi-task learning (MTL) is a learning paradigm to learn multiple related tasks simultaneously with a single shared network where each task has a distinct personalized header network for fine-tuning. MTL can be integrated into a federated learning (FL) setting if tasks are distributed across clients and clients have a single shared network, leading to personalized federated learning (PFL). To cope with statistical heterogeneity in the federated setting across clients which can significantly degrade the learning performance, we use a distributed dynamic weighting approach. To perform the communication between the remote parameter server (PS) and the clients efficiently over the noisy channel in a power and bandwidth-limited regime, we utilize over-the-air (OTA) aggregation and hierarchical federated learning (HFL). Thus, we propose hierarchical over-the-air (HOTA) PFL with a dynamic weighting strategy which we call HOTA-FedGradNorm. Our algorithm considers the channel conditions during the dynamic weight selection process. We conduct experiments on a wireless communication system dataset (RadComDynamic). The experimental results demonstrate that the training speed with HOTA-FedGradNorm is faster compared to the algorithms with a naive static equal weighting strategy. In addition, HOTA-FedGradNorm provides robustness against the negative channel effects by compensating for the channel conditions during the dynamic weight selection process.
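
The dynamic weighting strategy builds on GradNorm-style loss balancing: task weights are learned so that each task's gradient norm on the shared layers tracks its relative training rate. Below is a minimal centralized sketch of that balancing step in PyTorch, not the paper's federated HOTA-FedGradNorm algorithm; the two-task toy model, random data, and hyperparameters (alpha, learning rates) are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Shared trunk plus one personalized header per task, as in the MTL setup.
shared = nn.Sequential(nn.Linear(10, 32), nn.ReLU())
headers = nn.ModuleList([nn.Linear(32, 1) for _ in range(2)])

task_weights = nn.Parameter(torch.ones(2))  # learnable task weights w_i
alpha = 1.5                                  # balancing strength (illustrative)

opt_model = torch.optim.SGD(
    list(shared.parameters()) + list(headers.parameters()), lr=1e-2)
opt_weights = torch.optim.SGD([task_weights], lr=1e-2)

x = torch.randn(64, 10)                      # toy input batch
targets = [torch.randn(64, 1) for _ in range(2)]
loss_fn = nn.MSELoss()
initial_losses = None

for step in range(100):
    feats = shared(x)
    losses = torch.stack(
        [loss_fn(headers[i](feats), targets[i]) for i in range(2)])
    if initial_losses is None:
        initial_losses = losses.detach()

    total = torch.sum(task_weights * losses)  # weighted loss for model params

    # G_i(t): gradient norm of w_i * L_i w.r.t. the last shared layer.
    W = shared[0].weight
    grad_norms = torch.stack([
        torch.norm(torch.autograd.grad(task_weights[i] * losses[i], W,
                                       retain_graph=True, create_graph=True)[0])
        for i in range(2)])

    # Inverse training rate r_i(t) and the GradNorm balancing objective.
    loss_ratio = losses.detach() / initial_losses
    inv_rate = loss_ratio / loss_ratio.mean()
    target = (grad_norms.mean() * inv_rate ** alpha).detach()
    gradnorm_loss = torch.sum(torch.abs(grad_norms - target))

    # Task weights follow the balancing loss; model follows the total loss.
    weight_grad = torch.autograd.grad(gradnorm_loss, task_weights,
                                      retain_graph=True)[0]
    opt_model.zero_grad()
    total.backward()
    task_weights.grad = weight_grad
    opt_model.step()
    opt_weights.step()

    # Renormalize so the weights keep summing to the number of tasks.
    with torch.no_grad():
        task_weights.data *= 2.0 / task_weights.data.sum()
```

In the federated variant described in the abstract, the weight selection additionally accounts for the channel conditions seen during aggregation.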
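Over-the-air aggregation exploits the superposition property of the wireless multiple-access channel: clients transmit analog model updates simultaneously, and the channel itself sums them, so the parameter server receives a noisy aggregate in a single channel use per dimension. The toy NumPy sketch below illustrates this under idealized assumptions (perfect channel-inversion power control, Gaussian receiver noise); the fading gains, noise level, and dimensions are hypothetical and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
num_clients, dim = 4, 8

grads = rng.normal(size=(num_clients, dim))    # clients' local gradients (toy)

h = rng.rayleigh(scale=1.0, size=num_clients)  # fading gains (hypothetical)
p = 1.0 / h                                    # channel-inversion power control
noise = rng.normal(scale=0.05, size=dim)       # receiver noise (hypothetical)

# The channel superimposes the simultaneously transmitted analog signals:
# each pre-scaled gradient arrives multiplied by its channel gain, and the
# parameter server observes only the noisy sum.
received = np.sum((h * p)[:, None] * grads, axis=0) + noise

ota_avg = received / num_clients               # estimate of the mean gradient
print(np.linalg.norm(ota_avg - grads.mean(axis=0)))  # aggregation error
```

The hierarchical setting inserts intermediate servers between clients and the remote PS, so this noisy summation happens at each level of the hierarchy rather than in one hop.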
