Paper Title
Efficient Hyperparameter Tuning with Dynamic Accuracy Derivative-Free Optimization
Paper Authors
Paper Abstract
Many machine learning solutions are framed as optimization problems which rely on good hyperparameters. Algorithms for tuning these hyperparameters usually assume access to exact solutions to the underlying learning problem, which is typically not practical. Here, we apply a recent dynamic accuracy derivative-free optimization method to hyperparameter tuning, which allows inexact evaluations of the learning problem while retaining convergence guarantees. We test the method on the problem of learning elastic net weights for a logistic classifier, and demonstrate its robustness and efficiency compared to a fixed accuracy approach. This demonstrates a promising approach for hyperparameter tuning, with both convergence guarantees and practical performance.
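The following is a minimal, illustrative sketch of the setting the abstract describes, not the authors' algorithm: a derivative-free outer search tunes the two elastic-net weights of a logistic classifier, while the inner learning problem is solved only inexactly, to a tolerance that is tightened in stages. The staged tolerance schedule, the use of validation log-loss as the tuning objective, and the scikit-learn/SciPy solvers are all assumptions made for this sketch; the paper's method adapts the evaluation accuracy with convergence guarantees rather than using a fixed schedule.

```python
# Sketch only (assumed setup, not the paper's dynamic-accuracy method):
# outer derivative-free search over elastic-net hyperparameters, with the
# inner logistic-regression problem solved to a progressively tighter tolerance.
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

# Synthetic binary classification data standing in for a real learning task.
X, y = make_classification(n_samples=2000, n_features=50, n_informative=10,
                           random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)


def validation_loss(theta, inner_tol):
    """Inexact evaluation: solve the inner learning problem only to `inner_tol`."""
    log_C, l1_ratio = theta
    l1_ratio = float(np.clip(l1_ratio, 0.0, 1.0))  # keep the mixing weight valid
    clf = LogisticRegression(penalty="elasticnet", solver="saga",
                             C=float(np.exp(log_C)), l1_ratio=l1_ratio,
                             tol=inner_tol, max_iter=5000)
    clf.fit(X_tr, y_tr)
    return log_loss(y_val, clf.predict_proba(X_val))


theta = np.array([0.0, 0.5])           # initial guess: C = 1, l1_ratio = 0.5
for inner_tol in (1e-2, 1e-3, 1e-4):   # tighten the inner accuracy in stages
    result = minimize(validation_loss, theta, args=(inner_tol,),
                      method="Nelder-Mead",   # derivative-free outer solver
                      options={"maxfev": 30, "xatol": 1e-2, "fatol": 1e-3})
    theta = result.x
    print(f"inner tol {inner_tol:.0e}: loss {result.fun:.4f}, "
          f"C {np.exp(theta[0]):.3f}, l1_ratio {np.clip(theta[1], 0, 1):.3f}")
```

In this toy version, early outer iterations use cheap, loose inner solves and only the final stage pays for accurate ones, which is the efficiency argument the abstract makes against fixed-accuracy tuning.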