Paper Title

General convergence analysis of stochastic first order methods for composite optimization

Paper Author

Necoara, Ion

Paper Abstract

In this paper we consider stochastic composite convex optimization problems with the objective function satisfying a stochastic bounded gradient condition, with or without a quadratic functional growth property. These models include the most well-known classes of objective functions analyzed in the literature: non-smooth Lipschitz functions and composition of a (potentially) non-smooth function and a smooth function, with or without strong convexity. Based on the flexibility offered by our optimization model we consider several variants of stochastic first order methods, such as the stochastic proximal gradient and the stochastic proximal point algorithms. Usually, the convergence theory for these methods has been derived for simple stochastic optimization models satisfying restrictive assumptions, the rates are in general sublinear and hold only for specific decreasing stepsizes. Hence, we analyze the convergence rates of stochastic first order methods with constant or variable stepsize under general assumptions covering a large class of objective functions. For constant stepsize we show that these methods can achieve linear convergence rate up to a constant proportional to the stepsize and under some strong stochastic bounded gradient condition even pure linear convergence. Moreover, when a variable stepsize is chosen we derive sublinear convergence rates for these stochastic first order methods. Finally, the stochastic gradient mapping and the Moreau smoothing mapping introduced in the present paper lead to simple and intuitive proofs.
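As a concrete illustration of one of the methods named in the abstract, the sketch below implements a plain stochastic proximal gradient iteration for the standard composite model min_x E_xi[f(x; xi)] + g(x) with a constant stepsize. It is a minimal sketch, not the paper's exact scheme: the least-squares loss, the l1 regularizer, the stepsize value, and the helper names (soft_threshold, stochastic_prox_gradient) are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of stochastic proximal gradient (SPG) for the composite model
#   min_x  E_xi[ f(x; xi) ] + g(x),
# instantiated here (illustrative assumption) with a least-squares loss
# f(x; (a_i, b_i)) = 0.5 * (a_i @ x - b_i)**2 and g(x) = lam * ||x||_1,
# whose proximal operator is soft-thresholding.
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def stochastic_prox_gradient(A, b, lam, alpha, n_iters, seed=None):
    """SPG with constant stepsize: x+ = prox_{alpha*g}(x - alpha * grad f(x; xi))."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_iters):
        i = rng.integers(n)                       # sample xi uniformly over the data
        grad_i = (A[i] @ x - b[i]) * A[i]         # stochastic gradient of f(x; xi)
        x = soft_threshold(x - alpha * grad_i, alpha * lam)  # proximal step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 20))
    x_true = np.zeros(20); x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = stochastic_prox_gradient(A, b, lam=0.1, alpha=1e-3, n_iters=20000, seed=1)
    print("recovered support:", np.nonzero(np.abs(x_hat) > 0.1)[0])
```

Consistent with the abstract, with a constant stepsize such an iteration can be expected to converge linearly only up to a neighborhood of the solution whose size is proportional to the stepsize, while a decreasing stepsize trades this residual error for a sublinear rate.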
