Paper Title

DISH: A Distributed Hybrid Optimization Method Leveraging System Heterogeneity

Paper Authors

Xiaochun Niu, Ermin Wei

Abstract

We study distributed optimization problems over multi-agent networks, including consensus and network flow problems. Existing distributed methods neglect the heterogeneity among agents' computational capabilities, limiting their effectiveness. To address this, we propose DISH, a distributed hybrid method that leverages system heterogeneity. DISH allows agents with higher computational capabilities or lower computational costs to perform local Newton-type updates while others adopt simpler gradient-type updates. Notably, DISH covers existing methods like EXTRA, DIGing, and ESOM-0 as special cases. To analyze DISH's performance with general update directions, we formulate distributed problems as minimax problems and introduce GRAND (gradient-related ascent and descent) and its alternating version, Alt-GRAND, for solving these problems. GRAND generalizes DISH to centralized minimax settings, accommodating various descent ascent update directions, including gradient-type, Newton-type, scaled gradient, and other general directions, within acute angles to the partial gradients. Theoretical analysis establishes global sublinear and linear convergence rates for GRAND and Alt-GRAND in strongly-convex-nonconcave and strongly-convex-PL settings, providing linear rates for DISH. In addition, we derive the local superlinear convergence of Newton-based variations of GRAND in centralized settings. Numerical experiments validate the effectiveness of our methods.
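To make the hybrid-update idea concrete, here is a minimal sketch (not the authors' DISH/GRAND implementation) of a descent-ascent loop on a toy strongly-convex-strongly-concave minimax problem f(x, y) = 0.5·a·x² + b·x·y − 0.5·c·y². The min player uses a Newton-type direction (partial gradient scaled by the second derivative in x), while the max player uses a plain gradient direction, mirroring DISH's mix of Newton-type and gradient-type agents; all coefficients and step sizes are assumptions chosen for the demo.

```python
# Toy coefficients and step sizes (assumed for illustration only).
a, b, c = 4.0, 1.0, 2.0
eta_x, eta_y = 0.5, 0.1

x, y = 5.0, -3.0             # arbitrary start; the saddle point is (0, 0)
for _ in range(300):
    grad_x = a * x + b * y   # partial gradient of f in x
    grad_y = b * x - c * y   # partial gradient of f in y
    x -= eta_x * grad_x / a  # Newton-type descent: scale by d2f/dx2 = a
    y += eta_y * grad_y      # gradient-type ascent

print(abs(x), abs(y))        # both iterates shrink toward the saddle (0, 0)
```

Both directions form acute angles with the respective partial gradients, which is the "gradient-related" condition GRAND's analysis places on admissible update directions.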
