Paper Title

私人加权随机步行随机梯度下降

Private Weighted Random Walk Stochastic Gradient Descent

Authors

Ghadir Ayache, Salim El Rouayheb

Abstract


We consider a decentralized learning setting in which data is distributed over nodes in a graph. The goal is to learn a global model on the distributed data without involving any central entity that needs to be trusted. While gossip-based stochastic gradient descent (SGD) can be used to achieve this learning objective, it incurs high communication and computation costs, since it has to wait for all the local models at all the nodes to converge. To speed up the convergence, we propose instead to study random walk based SGD in which a global model is updated based on a random walk on the graph. We propose two algorithms based on two types of random walks that achieve, in a decentralized way, uniform sampling and importance sampling of the data. We provide a non-asymptotic analysis on the rate of convergence, taking into account the constants related to the data and the graph. Our numerical results show that the weighted random walk based algorithm has a better performance for high-variance data. Moreover, we propose a privacy-preserving random walk algorithm that achieves local differential privacy based on a Gamma noise mechanism that we propose. We also give numerical results on the convergence of this algorithm and show that it outperforms additive Laplace-based privacy mechanisms.
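The core idea of random walk based SGD can be illustrated with a toy sketch: a single global model hops between neighboring nodes of a graph, and at each step it is updated with the gradient of the loss on the current node's local data. The graph, data, step size, and helper names below are illustrative assumptions, not the paper's actual experimental setup; in particular, this sketch uses a simple (unweighted) random walk and omits the weighted/importance-sampling variants and the privacy mechanism.

```python
import random

# Toy graph: node -> list of neighbors (an undirected 5-node ring; assumed for illustration).
graph = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}

# Each node holds one local data point (x, y) for a 1-D least-squares problem (synthetic data).
data = {0: (1.0, 2.1), 1: (2.0, 3.9), 2: (3.0, 6.2), 3: (4.0, 7.8), 4: (5.0, 10.1)}

def local_gradient(w, node):
    # Gradient of the local loss (w*x - y)^2 / 2 at this node's data point.
    x, y = data[node]
    return (w * x - y) * x

def random_walk_sgd(steps=2000, lr=0.01, seed=0):
    rng = random.Random(seed)
    w = 0.0
    node = rng.choice(sorted(graph))  # start the walk at a random node
    for _ in range(steps):
        w -= lr * local_gradient(w, node)  # update the global model with the local gradient
        node = rng.choice(graph[node])     # pass the model to a uniformly random neighbor
    return w

w = random_walk_sgd()
print(w)  # converges near the least-squares slope of the pooled data (about 2.0)
```

Because the ring is degree-regular, the simple random walk's stationary distribution is uniform over nodes, so in the long run each data point is sampled equally often; on irregular graphs, achieving uniform or importance sampling requires reweighting the transition probabilities, which is what the paper's weighted random walks address.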
