Title


Ensemble-based gradient inference for particle methods in optimization and sampling

Authors

Claudia Schillings, Claudia Totzeck, Philipp Wacker

Abstract


We propose an approach based on function evaluations and Bayesian inference to extract higher-order differential information of objective functions from a given ensemble of particles. Pointwise evaluations $\{V(x^i)\}_i$ of some potential $V$ in an ensemble $\{x^i\}_i$ contain implicit information about first- or higher-order derivatives, which can be made explicit with little computational effort (ensemble-based gradient inference, EGI). We suggest using this information to improve established ensemble-based numerical methods for optimization and sampling, such as consensus-based optimization and Langevin-based samplers. Numerical studies indicate that the augmented algorithms are often superior to their gradient-free variants; in particular, the augmented methods help the ensembles escape their initial domain, explore multimodal, non-Gaussian settings, and speed up the collapse at the end of the optimization dynamics. The code for the numerical examples in this manuscript can be found in the paper's GitHub repository (https://github.com/MercuryBench/ensemble-based-gradient.git).
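To illustrate the core idea that pointwise evaluations $\{V(x^i)\}_i$ carry implicit first-order information, the sketch below estimates $\nabla V$ at the ensemble mean by a least-squares linear fit to the centred evaluations. This is a deliberate simplification, not the paper's Bayesian inference procedure: the function `infer_gradient` and the quadratic test potential are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's full Bayesian EGI): recover an approximate
# gradient of V at the ensemble mean from the pointwise evaluations {V(x^i)},
# by fitting a linear model to the centred ensemble.
def infer_gradient(X, v):
    """X: (J, d) array of particles x^i; v: (J,) array of evaluations V(x^i)."""
    m = X.mean(axis=0)            # ensemble mean
    D = X - m                     # centred particle deviations
    c = v - v.mean()              # centred evaluations
    # Solve D @ g ~= c in the least-squares sense; g approximates grad V(m).
    g, *_ = np.linalg.lstsq(D, c, rcond=None)
    return m, g

# Illustration on a quadratic potential V(x) = 0.5 * ||x||^2, where grad V(x) = x.
rng = np.random.default_rng(0)
X = rng.normal(loc=[1.0, -2.0], scale=0.1, size=(20, 2))
v = 0.5 * np.sum(X**2, axis=1)
m, g = infer_gradient(X, v)
print(m, g)  # the inferred gradient g should lie close to the mean m
```

An inferred gradient of this kind could then augment a gradient-free particle update, e.g. adding a drift term proportional to `-g` to a consensus-based optimization step; the abstract's point is that this costs no extra evaluations of $V$.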
