Paper Title

Straggler-Resilient Differentially-Private Decentralized Learning

Authors

Yauhen Yakimenka, Chung-Wei Weng, Hsuan-Yin Lin, Eirik Rosnes, Jörg Kliewer

Abstract

We consider the straggler problem in decentralized learning over a logical ring while preserving user data privacy. In particular, we extend the recently proposed framework of differential privacy (DP) amplification by decentralization by Cyffers and Bellet to include overall training latency, comprising both computation and communication latency. Analytical results on both the convergence speed and the DP level are derived for both a skipping scheme (which ignores the stragglers after a timeout) and a baseline scheme that waits for each node to finish before the training continues. A trade-off between overall training latency, accuracy, and privacy, parameterized by the timeout of the skipping scheme, is identified and empirically validated for logistic regression on a real-world dataset and for image classification using the MNIST and CIFAR-10 datasets.
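The abstract's central idea, that skipping stragglers after a timeout bounds per-node waiting time at the cost of dropped updates, can be illustrated with a minimal toy simulation. This is a hedged sketch, not the paper's actual algorithm: the function name `ring_pass`, the exponential delay model, and the timeout value are all illustrative assumptions.

```python
import random


def ring_pass(latencies, timeout=None):
    """Simulate one pass of the model token around a logical ring.

    Each entry of `latencies` is a node's combined computation and
    communication delay (hypothetical model). With `timeout` set (the
    skipping scheme), a node whose delay exceeds the timeout is skipped:
    the pass waits only `timeout` for it and its update is ignored.
    With `timeout=None` (the baseline scheme), every node is waited for.
    Returns (total latency of the pass, number of updates applied).
    """
    total, updates = 0.0, 0
    for delay in latencies:
        if timeout is not None and delay > timeout:
            total += timeout  # wait until the timeout, then skip the node
        else:
            total += delay
            updates += 1
    return total, updates


random.seed(0)
# Toy exponentially distributed delays for a 10-node ring (assumption).
delays = [random.expovariate(1.0) for _ in range(10)]

baseline_latency, baseline_updates = ring_pass(delays)              # waits for all
skipping_latency, skipping_updates = ring_pass(delays, timeout=1.0)  # hypothetical timeout
```

Varying the timeout traces out exactly the latency/accuracy trade-off the abstract describes: a smaller timeout lowers the pass latency but discards more node updates per pass.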
