Paper Title

Design Amortization for Bayesian Optimal Experimental Design

Authors

Noble Kennamer, Steven Walton, Alexander Ihler

Abstract

Bayesian optimal experimental design is a sub-field of statistics focused on developing methods to make efficient use of experimental resources. Any potential design is evaluated in terms of a utility function, such as the (theoretically well-justified) expected information gain (EIG); unfortunately, however, under most circumstances the EIG is intractable to evaluate. In this work we build off of successful variational approaches, which optimize a parameterized variational model with respect to bounds on the EIG. Past work focused on learning a new variational model from scratch for each new design considered. Here we present a novel neural architecture that allows experimenters to optimize a single variational model that can estimate the EIG for potentially infinitely many designs. To further improve computational efficiency, we also propose to train the variational model on a significantly cheaper-to-evaluate lower bound, and show empirically that the resulting model provides an excellent guide for more accurate, but expensive-to-evaluate, bounds on the EIG. We demonstrate the effectiveness of our technique on generalized linear models, a class of statistical models that is widely used in the analysis of controlled experiments. Experiments show that our method is able to greatly improve accuracy over existing approximation strategies, and achieve these results with far better sample efficiency.
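To make the amortization idea concrete, here is a minimal sketch (in PyTorch, not the authors' code or architecture) of training a single design-conditioned variational posterior q_phi(theta | y, d) on a standard variational EIG lower bound, the Barber-Agakov posterior bound, for a toy linear-Gaussian model. The paper's cheaper training bound and network design may differ; all names here (AmortizedPosterior, simulate, eig_lower_bound) and the toy simulator are illustrative assumptions.

# Sketch: amortized variational EIG lower bound (Barber-Agakov style),
# EIG(d) >= H[p(theta)] + E_{p(theta) p(y|theta,d)}[log q_phi(theta | y, d)].
# Since H[p(theta)] does not depend on the design, maximizing the expectation
# term is enough both for training q_phi and for ranking candidate designs.
import torch
import torch.nn as nn

class AmortizedPosterior(nn.Module):
    """Variational posterior q_phi(theta | y, d) that takes the design as an
    input, so one network covers many candidate designs (illustrative)."""
    def __init__(self, theta_dim, y_dim, design_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(y_dim + design_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * theta_dim),  # Gaussian mean and log-std
        )

    def log_prob(self, theta, y, design):
        out = self.net(torch.cat([y, design], dim=-1))
        mean, log_std = out.chunk(2, dim=-1)
        dist = torch.distributions.Normal(mean, log_std.exp())
        return dist.log_prob(theta).sum(-1)

def simulate(design, n_samples, noise_sd=0.1):
    """Toy linear-Gaussian simulator (assumption): theta ~ N(0, I),
    y = <design, theta> + Gaussian noise."""
    theta = torch.randn(n_samples, design.shape[-1])
    y = (theta * design).sum(-1, keepdim=True) + noise_sd * torch.randn(n_samples, 1)
    return theta, y

def eig_lower_bound(q, design, n_samples=512):
    """Barber-Agakov lower bound up to the design-independent prior entropy."""
    theta, y = simulate(design, n_samples)
    d = design.expand(n_samples, -1)
    return q.log_prob(theta, y, d).mean()

# Train on randomly drawn designs each step so q_phi amortizes over designs.
q = AmortizedPosterior(theta_dim=3, y_dim=1, design_dim=3)
opt = torch.optim.Adam(q.parameters(), lr=1e-3)
for step in range(1000):
    design = torch.rand(1, 3)           # a random candidate design
    loss = -eig_lower_bound(q, design)  # maximize the lower bound
    opt.zero_grad(); loss.backward(); opt.step()

# The same trained q_phi then scores new designs (up to a constant).
with torch.no_grad():
    candidates = [torch.rand(1, 3) for _ in range(5)]
    scores = [eig_lower_bound(q, d).item() for d in candidates]
best = max(range(len(candidates)), key=lambda i: scores[i])
print(f"best candidate design: {candidates[best].tolist()}, score: {scores[best]:.3f}")

Because the variational network receives the design as an input, a single trained model can estimate the EIG bound for arbitrary candidate designs, which is what removes the need to refit a variational model from scratch for every design considered.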
