Paper Title
Bayesian Optimization with Output-Weighted Optimal Sampling
Paper Authors
Paper Abstract
In Bayesian optimization, accounting for the importance of the output relative to the input is a crucial yet challenging exercise, as it can considerably improve the final result but often involves inaccurate and cumbersome entropy estimations. We approach the problem from the perspective of importance-sampling theory, and advocate the use of the likelihood ratio to guide the search algorithm towards regions of the input space where the objective function to be minimized assumes abnormally small values. The likelihood ratio acts as a sampling weight and can be computed at each iteration without severely deteriorating the overall efficiency of the algorithm. In particular, it can be approximated in a way that makes the approach tractable in high dimensions. The "likelihood-weighted" acquisition functions introduced in this work are found to outperform their unweighted counterparts in a number of applications.
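To make the idea concrete, below is a minimal Python sketch of one way a likelihood-weighted acquisition function can be assembled: a Gaussian-process surrogate supplies the posterior mean, a kernel density estimate of that mean stands in for the output density p_y, and the likelihood ratio w(x) = p_x(x) / p_y(mu(x)) multiplies a standard expected-improvement criterion. The function and variable names (lw_expected_improvement, likelihood_ratio, candidates) are illustrative assumptions, and the exact weighting and density approximation used in the paper may differ from this sketch.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF


def likelihood_ratio(gp, X, x_density=1.0):
    """Sampling weight w(x) = p_x(x) / p_y(mu(x)).

    Favors inputs whose predicted output is rare under the output density,
    e.g. abnormally small values of the objective. Here p_x is assumed
    uniform and p_y is approximated by a KDE of the posterior mean.
    """
    mu = gp.predict(X)
    p_y = gaussian_kde(mu)(mu)          # KDE estimate of the output density
    return x_density / np.maximum(p_y, 1e-12)


def lw_expected_improvement(gp, X, y_best):
    """Expected improvement (minimization) multiplied by the likelihood ratio."""
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    z = (y_best - mu) / sigma
    ei = (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    return ei * likelihood_ratio(gp, X)


# Usage: fit a GP surrogate to the observed data, then pick the candidate
# maximizing the likelihood-weighted acquisition as the next query point.
rng = np.random.default_rng(0)
X_train = rng.uniform(-1, 1, size=(10, 2))
y_train = np.sum(X_train**2, axis=1)    # toy objective to minimize
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X_train, y_train)

candidates = rng.uniform(-1, 1, size=(500, 2))
acq = lw_expected_improvement(gp, candidates, y_best=y_train.min())
x_next = candidates[np.argmax(acq)]
```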