Paper title
The equivalence between Stein variational gradient descent and black-box variational inference
Paper authors
Paper abstract
We formalize an equivalence between two popular methods for Bayesian inference: Stein variational gradient descent (SVGD) and black-box variational inference (BBVI). In particular, we show that BBVI corresponds precisely to SVGD when the kernel is the neural tangent kernel. Furthermore, we interpret SVGD and BBVI as kernel gradient flows; we do this by leveraging the recent perspective that views SVGD as a gradient flow in the space of probability distributions and showing that BBVI naturally motivates a Riemannian structure on that space. We observe that kernel gradient flow also describes dynamics found in the training of generative adversarial networks (GANs). This work thereby unifies several existing techniques in variational inference and generative modeling and identifies the kernel as a fundamental object governing the behavior of these algorithms, motivating deeper analysis of its properties.
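For readers unfamiliar with SVGD, the particle update at the heart of the method can be sketched as follows. This is a minimal illustrative implementation using an RBF kernel; the bandwidth `h` and step size `eps` are generic hyperparameters for the sketch, not values from the paper, and the paper's actual contribution concerns replacing this kernel with the neural tangent kernel.

```python
import numpy as np

def svgd_step(X, score, h=1.0, eps=0.1):
    """One SVGD update on a particle array X of shape (n, d).

    score(X) returns grad log p(x) at each particle; h is the RBF
    bandwidth and eps the step size. Illustrative sketch only.
    """
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]                 # x_i - x_j, shape (n, n, d)
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h**2))  # RBF kernel matrix k(x_i, x_j)
    # Driving term: kernel-weighted scores; repulsive term:
    # sum_j grad_{x_j} k(x_j, x_i), which keeps particles spread out.
    phi = (K @ score(X) + (K[..., None] * diff).sum(axis=1) / h**2) / n
    return X + eps * phi
```

Running this repeatedly with `score(X) = -X` (the score of a standard normal target) drives an arbitrary initial particle cloud toward samples from that normal, with the repulsive term preventing collapse to the mode.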