Title

Sparse Bayesian Optimization

Authors

Sulin Liu, Qing Feng, David Eriksson, Benjamin Letham, Eytan Bakshy

Abstract

Bayesian optimization (BO) is a powerful approach to sample-efficient optimization of black-box objective functions. However, the application of BO to areas such as recommendation systems often requires taking the interpretability and simplicity of the configurations into consideration, a setting that has not been previously studied in the BO literature. To make BO useful for this setting, we present several regularization-based approaches that allow us to discover sparse and more interpretable configurations. We propose a novel differentiable relaxation based on homotopy continuation that makes it possible to target sparsity by working directly with $L_0$ regularization. We identify failure modes for regularized BO and develop a hyperparameter-free method, Sparsity Exploring Bayesian Optimization (SEBO), that seeks to simultaneously maximize a target objective and sparsity. SEBO and methods based on fixed regularization are evaluated on synthetic and real-world problems, and we show that we are able to efficiently optimize for sparsity.
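The core idea behind the differentiable relaxation mentioned above can be illustrated with a minimal sketch. The $L_0$ norm (the count of nonzero entries) is non-differentiable, so a common trick is to replace it with a smooth surrogate that converges to $L_0$ as a temperature parameter is annealed toward zero, which is the essence of a homotopy continuation. The surrogate $\sum_i \big(1 - e^{-x_i^2/a}\big)$ used below and the function names are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def smooth_l0(x, a):
    """Smooth surrogate for the L0 norm: sum_i (1 - exp(-x_i^2 / a)).

    For large `a` the surrogate is smooth and easy to optimize; as
    `a` -> 0 it approaches the true L0 norm (count of nonzeros).
    Annealing `a` from large to small is a homotopy continuation.
    """
    x = np.asarray(x, dtype=float)
    return float(np.sum(1.0 - np.exp(-x**2 / a)))

def regularized_objective(f, x, lam, a):
    """Target objective f(x) penalized by the smooth L0 surrogate.

    `lam` trades off objective value against sparsity; a fixed `lam`
    corresponds to the fixed-regularization baselines in the abstract.
    """
    return f(x) - lam * smooth_l0(x, a)
```

For example, with `a = 1e-3` the surrogate of `[10.0, 0.0]` is very close to 1 (one nonzero entry), while at `a = 10.0` the same point gives a much smaller, smoother penalty that gradient-based acquisition optimization can follow.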
