Paper Title
Bayesian posterior approximation with stochastic ensembles
Paper Authors
Paper Abstract
We introduce ensembles of stochastic neural networks to approximate the Bayesian posterior, combining stochastic methods such as dropout with deep ensembles. The stochastic ensembles are formulated as families of distributions and trained to approximate the Bayesian posterior with variational inference. We implement stochastic ensembles based on Monte Carlo dropout, DropConnect and a novel non-parametric version of dropout and evaluate them on a toy problem and CIFAR image classification. For both tasks, we test the quality of the posteriors directly against Hamiltonian Monte Carlo simulations. Our results show that stochastic ensembles provide more accurate posterior estimates than other popular baselines for Bayesian inference.
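As a rough illustration of the idea described in the abstract, and not the authors' implementation, the sketch below assumes PyTorch and combines a small ensemble of independently initialized networks with Monte Carlo dropout, treating each (ensemble member, dropout mask) pair as one approximate posterior sample. The class and function names (MCDropoutNet, posterior_predictive) and all hyperparameters are hypothetical choices for this example.

```python
# Minimal sketch of a stochastic ensemble: several independently
# initialized MC-dropout networks whose pooled predictive samples
# approximate the Bayesian posterior predictive distribution.
import torch
import torch.nn as nn

class MCDropoutNet(nn.Module):
    """Small classifier with dropout kept active at prediction time."""
    def __init__(self, in_dim=32 * 32 * 3, hidden=256, classes=10, p=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, classes),
        )

    def forward(self, x):
        return self.net(x)

def posterior_predictive(ensemble, x, mc_samples=20):
    """Average softmax outputs over ensemble members and dropout masks."""
    probs = []
    for model in ensemble:
        model.train()  # keep dropout stochastic (Monte Carlo dropout)
        with torch.no_grad():
            for _ in range(mc_samples):
                probs.append(torch.softmax(model(x), dim=-1))
    # Each forward pass is one approximate posterior sample; averaging
    # them gives the approximate posterior predictive distribution.
    return torch.stack(probs).mean(dim=0)

# Usage: an ensemble of 5 independently initialized members on
# CIFAR-sized inputs.
ensemble = [MCDropoutNet() for _ in range(5)]
x = torch.randn(8, 3, 32, 32)
pred = posterior_predictive(ensemble, x)
print(pred.shape)  # torch.Size([8, 10])
```

In this sketch the diversity of the approximate posterior comes from two sources, as in the abstract: the different random initializations of the ensemble members (deep ensembles) and the random dropout masks drawn at prediction time (stochastic networks); the paper's variational-inference training and the DropConnect and non-parametric dropout variants are not shown here.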