Paper Title

Near-Optimal Algorithms for Making the Gradient Small in Stochastic Minimax Optimization

Paper Authors

Lesi Chen, Luo Luo

Paper Abstract

We study the problem of finding a near-stationary point for smooth minimax optimization. The recently proposed extra anchored gradient (EAG) methods achieve the optimal convergence rate for the convex-concave minimax problem in the deterministic setting. However, the direct extension of EAG to stochastic optimization is not efficient. In this paper, we design a novel stochastic algorithm called Recursive Anchored IteratioN (RAIN). We show that RAIN achieves near-optimal stochastic first-order oracle (SFO) complexity for stochastic minimax optimization in both the convex-concave and strongly-convex-strongly-concave cases. In addition, we extend the idea of RAIN to solve structured nonconvex-nonconcave minimax problems, where it also achieves near-optimal SFO complexity.
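As a minimal sketch of the setting described in the abstract (the symbols f, x, y, and \epsilon below are standard notation for this problem class, assumed here rather than quoted from the paper), the goal is to solve a smooth minimax problem and return a point whose gradient is small:

\[
\min_{x \in \mathbb{R}^{d_x}} \; \max_{y \in \mathbb{R}^{d_y}} \; f(x, y),
\qquad
\text{find } (x, y) \text{ such that } \|\nabla f(x, y)\| \le \epsilon .
\]

In the stochastic setting, the algorithm accesses f only through a stochastic first-order oracle (SFO) that returns unbiased noisy estimates of \nabla f(x, y); the SFO complexity is the number of such oracle calls needed to produce an \epsilon-stationary point.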
