Paper Title


On the convergence of decentralized gradient descent with diminishing stepsize, revisited

Authors

Woocheol Choi, Jimyeong Kim

Abstract


Distributed optimization has received a lot of interest in recent years due to its wide applications in various fields. In this work, we revisit the convergence property of the decentralized gradient descent of A. Nedić and A. Ozdaglar (2009) on the whole space, given by $$ x_i(t+1) = \sum^m_{j=1}w_{ij}x_j(t) - \alpha(t) \nabla f_i(x_i(t)), $$ where the stepsize is $\alpha(t) = \frac{a}{(t+w)^p}$ with $0< p\leq 1$. Under the strong convexity assumption on the total cost function $f$, with the local cost functions $f_i$ not necessarily convex, we show that the sequence converges to the optimizer at rate $O(t^{-p})$ when the values of $a>0$ and $w>0$ are suitably chosen.
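The iteration in the abstract can be illustrated with a minimal sketch of decentralized gradient descent under the stated diminishing stepsize. The quadratic local costs, the ring-topology mixing matrix, and all parameter values below are illustrative assumptions, not taken from the paper; note that each local $f_i$ here happens to be convex, which the paper does not require.

```python
import numpy as np

# Sketch of decentralized gradient descent (DGD) with diminishing
# stepsize alpha(t) = a / (t + w)^p, as in the abstract.
# Assumed setup: local costs f_i(x) = 0.5 * (x - b_i)^2, so the total
# cost f = sum_i f_i is strongly convex with minimizer mean(b).

m = 4                                # number of agents
b = np.array([1.0, 2.0, 3.0, 4.0])   # targets of the local costs
grad = lambda i, x: x - b[i]         # local gradient: nabla f_i(x)

# Doubly stochastic mixing matrix W on a ring
# (self weight 1/2, each neighbor 1/4).
W = np.zeros((m, m))
for i in range(m):
    W[i, i] = 0.5
    W[i, (i - 1) % m] = 0.25
    W[i, (i + 1) % m] = 0.25

a, w, p = 1.0, 1.0, 1.0              # stepsize parameters, 0 < p <= 1
x = np.zeros(m)                      # x_i(0) = 0 for every agent

for t in range(20000):
    alpha = a / (t + w) ** p
    g = np.array([grad(i, x[i]) for i in range(m)])
    # x_i(t+1) = sum_j w_ij * x_j(t) - alpha(t) * grad f_i(x_i(t))
    x = W @ x - alpha * g

print(x)                             # every agent's iterate is near b.mean() = 2.5
```

Because $W$ is doubly stochastic, averaging the update over the agents drives the network mean toward the minimizer of $f$, while the diminishing stepsize shrinks the residual consensus error between agents.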
