Paper Title

Dimension-free convergence rates for gradient Langevin dynamics in RKHS

Paper Authors

Boris Muzellec, Kanji Sato, Mathurin Massias, Taiji Suzuki

Paper Abstract

Gradient Langevin dynamics (GLD) and stochastic GLD (SGLD) have attracted considerable attention lately, as a way to provide convergence guarantees in a non-convex setting. However, the known rates grow exponentially with the dimension of the space. In this work, we provide a convergence analysis of GLD and SGLD when the optimization space is an infinite-dimensional Hilbert space. More precisely, we derive non-asymptotic, dimension-free convergence rates for GLD/SGLD when performing regularized non-convex optimization in a reproducing kernel Hilbert space. Among other tools, the convergence analysis relies on the properties of a stochastic differential equation, its discrete-time Galerkin approximation, and the geometric ergodicity of the associated Markov chains.
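For context, a single GLD step in finite dimensions is plain gradient descent perturbed by isotropic Gaussian noise, with the noise scale set by the step size and an inverse temperature. The sketch below is a minimal illustration of that standard update, not the paper's algorithm: the function names, the toy objective, and all parameter values are our own choices for demonstration, and the paper's actual analysis concerns the infinite-dimensional RKHS setting via a Galerkin discretization.

```python
import numpy as np

def gld(grad_f, x0, step=1e-3, inv_temp=1e3, n_iters=1000, seed=0):
    """Illustrative GLD loop: gradient descent plus scaled Gaussian noise.

    Update rule (standard finite-dimensional GLD):
        x_{k+1} = x_k - step * grad_f(x_k)
                  + sqrt(2 * step / inv_temp) * xi_k,   xi_k ~ N(0, I).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_iters):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_f(x) + np.sqrt(2.0 * step / inv_temp) * noise
    return x

# Toy usage on a hypothetical non-convex objective
# f(x) = ||x||^2 / 2 + cos(x_1), whose gradient is x with
# -sin(x_1) added to the first coordinate.
grad = lambda x: x + np.concatenate(([-np.sin(x[0])], np.zeros(len(x) - 1)))
x_final = gld(grad, x0=np.ones(5))
```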
