Paper Title


FedDAR: Federated Domain-Aware Representation Learning

Authors

Aoxiao Zhong, Hao He, Zhaolin Ren, Na Li, Quanzheng Li

Abstract


Cross-silo federated learning (FL) has become a promising tool for machine learning applications in healthcare. It allows hospitals/institutions to train models on sufficient data while the data is kept private. To ensure the FL model is robust when facing heterogeneous data among FL clients, most efforts focus on personalizing models for clients; however, the latent relationships between clients' data are ignored. In this work, we focus on a special non-IID FL problem, called Domain-mixed FL, where each client's data distribution is assumed to be a mixture of several predefined domains. Recognizing the diversity across domains and the similarity within domains, we propose a novel method, FedDAR, which learns a domain-shared representation and domain-wise personalized prediction heads in a decoupled manner. For simplified linear regression settings, we theoretically prove that FedDAR enjoys a linear convergence rate. For general settings, we have performed intensive empirical studies on both synthetic and real-world medical datasets, which demonstrate its superiority over prior FL methods.
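The core idea above — a representation shared across all domains plus a separate prediction head per domain, updated in a decoupled way across clients — can be sketched in the simplified linear regression setting the abstract mentions. The following is an illustrative simulation, not the paper's exact algorithm: the dimensions, learning rate, domain-mixture sampling, and full-batch server averaging are all assumptions made for the sketch.

```python
import numpy as np

# Sketch of domain-aware federated learning in a linear setting:
# prediction for a sample from domain j is x @ B @ w[j], with a
# representation B shared across domains and per-domain heads w[j].
rng = np.random.default_rng(0)
d, k, n_domains, n_clients, n = 10, 3, 2, 4, 200  # illustrative sizes

# Ground-truth shared representation and domain-specific heads
B_true = rng.normal(size=(d, k))
w_true = rng.normal(size=(n_domains, k))

def make_client():
    """Each client's data is a mixture of the predefined domains."""
    X = rng.normal(size=(n, d))
    dom = rng.integers(0, n_domains, size=n)      # domain label per sample
    y = np.einsum("ij,jk,ik->i", X, B_true, w_true[dom])
    return X, dom, y

clients = [make_client() for _ in range(n_clients)]

# Federated training: each client computes full-batch gradients for the
# shared B and for each domain head; the server averages and steps.
B = 0.1 * rng.normal(size=(d, k))
w = np.zeros((n_domains, k))
lr = 0.02
losses = []

for rnd in range(400):
    gB = np.zeros_like(B)
    gw = np.zeros_like(w)
    total = 0.0
    for X, dom, y in clients:
        r = np.einsum("ij,jk,ik->i", X, B, w[dom]) - y  # residuals
        total += 0.5 * np.mean(r**2)
        for j in range(n_domains):
            m = dom == j
            gB += np.outer(X[m].T @ r[m], w[j]) / n     # grad wrt shared B
            gw[j] += (X[m] @ B).T @ r[m] / n            # grad wrt head w_j
    losses.append(total / n_clients)
    B -= lr * gB / n_clients          # update shared representation
    w -= lr * gw / n_clients          # decoupled per-domain head updates
```

Because every client contributes gradients to the shared `B` while each head `w[j]` aggregates only the samples labeled with domain `j`, clients benefit from cross-domain data for the representation without forcing a single head onto heterogeneous domains.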
