Paper Title
Continual Horizontal Federated Learning for Heterogeneous Data
Paper Authors
Abstract
Federated learning is a promising machine learning technique that enables multiple clients to collaboratively build a model without revealing their raw data to each other. Among the various types of federated learning methods, horizontal federated learning (HFL) is the best-studied category and handles homogeneous feature spaces. However, in the case of heterogeneous feature spaces, HFL uses only the common features and leaves client-specific features unutilized. In this paper, we propose an HFL method using neural networks, named continual horizontal federated learning (CHFL), a continual learning approach that improves the performance of HFL by taking advantage of each client's unique features. CHFL splits the network into two columns corresponding to common features and unique features, respectively. It jointly trains the first column on the common features through vanilla HFL, and locally trains the second column on the unique features, leveraging the knowledge of the first column via lateral connections without interfering with its federated training. We conduct experiments on various real-world datasets and show that CHFL greatly outperforms both vanilla HFL, which uses only the common features, and local learning, which uses all the features each client has.
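To make the two-column structure concrete, the following is a minimal forward-pass sketch of the architecture the abstract describes. All layer sizes, weight names, and the additive combination of the two heads are illustrative assumptions, not details taken from the paper; it only shows how a federated column over common features and a local column over unique features can be wired together via a lateral connection.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Hypothetical dimensions, chosen only for illustration.
d_common, d_unique, d_hidden, d_out = 8, 4, 16, 2

# Column 1: would be trained jointly on common features via vanilla HFL.
W1_common = rng.normal(size=(d_common, d_hidden)) * 0.1
W2_common = rng.normal(size=(d_hidden, d_out)) * 0.1

# Column 2: would be trained locally on each client's unique features.
W1_unique = rng.normal(size=(d_unique, d_hidden)) * 0.1
W2_unique = rng.normal(size=(d_hidden, d_out)) * 0.1

# Lateral connection: feeds column 1's hidden activations into column 2.
U_lateral = rng.normal(size=(d_hidden, d_hidden)) * 0.1

def forward(x_common, x_unique):
    h1 = relu(x_common @ W1_common)   # column-1 hidden layer (federated)
    y1 = h1 @ W2_common               # federated prediction head
    # Column 2 consumes its own input plus column 1's hidden state.
    # During local training, gradients would be stopped at h1 so the
    # local column does not interfere with the federated training.
    h2 = relu(x_unique @ W1_unique + h1 @ U_lateral)
    y2 = h2 @ W2_unique
    return y1 + y2                    # combined prediction (assumed sum)

x_c = rng.normal(size=(1, d_common))
x_u = rng.normal(size=(1, d_unique))
print(forward(x_c, x_u).shape)  # → (1, 2)
```

The key design point visible here is that column 1 never depends on column 2, so the common-feature column can be averaged across clients exactly as in vanilla HFL while each client keeps its own private second column.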