Title
A hybrid proximal generalized conditional gradient method and application to total variation parameter learning
Authors
Abstract
In this paper we present a new method for solving optimization problems involving the sum of two proper, convex, lower semicontinuous functions, one of which has a Lipschitz continuous gradient. The proposed method is of hybrid nature, combining the classical forward-backward and generalized conditional gradient methods. We establish a convergence rate of $o(k^{-1/3})$ under mild assumptions for a specific step-size rule, and show an application to a total variation parameter learning problem, which demonstrates the method's benefits in the context of nonsmooth convex optimization.
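The abstract names two classical building blocks: the forward-backward (proximal gradient) method and the generalized conditional gradient method. As a hedged illustration of the first of these, the following is a minimal, generic forward-backward iteration for the model problem $\min_x f(x) + g(x)$ with smooth $f(x) = \tfrac{1}{2}\|Ax-b\|^2$ and nonsmooth $g(x) = \lambda\|x\|_1$. This is a textbook sketch, not the paper's hybrid scheme; all function and variable names are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, n_iter=200):
    """Generic forward-backward (proximal gradient) iteration for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1,
    i.e. f smooth with Lipschitz continuous gradient, g convex
    nonsmooth with an inexpensive proximal operator."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad f
    tau = 1.0 / L                        # constant step size tau <= 1/L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)         # forward (gradient) step on f
        x = soft_threshold(x - tau * grad, tau * lam)  # backward (prox) step on g
    return x

# Tiny example: with A = I and b = [3, -0.5], lam = 1, the minimizer is
# the soft-thresholded vector [2, 0].
x = forward_backward(np.eye(2), np.array([3.0, -0.5]), lam=1.0)
print(x)
```

The generalized conditional gradient method replaces the backward (prox) step by a linear-minimization oracle over the nonsmooth part; the paper's contribution, per the abstract, is a hybrid of the two schemes with a specific step-size rule.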