Paper Title

Efficient Hyperparameter Tuning for Large Scale Kernel Ridge Regression

Authors

Giacomo Meanti, Luigi Carratino, Ernesto De Vito, Lorenzo Rosasco

Abstract

Kernel methods provide a principled approach to nonparametric learning. While their basic implementations scale poorly to large problems, recent advances showed that approximate solvers can efficiently handle massive datasets. A shortcoming of these solutions is that hyperparameter tuning is not taken care of, and left for the user to perform. Hyperparameters are crucial in practice and the lack of automated tuning greatly hinders efficiency and usability. In this paper, we work to fill in this gap focusing on kernel ridge regression based on the Nyström approximation. After reviewing and contrasting a number of hyperparameter tuning strategies, we propose a complexity regularization criterion based on a data dependent penalty, and discuss its efficient optimization. Then, we proceed to a careful and extensive empirical evaluation highlighting strengths and weaknesses of the different tuning strategies. Our analysis shows the benefit of the proposed approach, that we hence incorporate in a library for large scale kernel methods to derive adaptively tuned solutions.
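For context, below is a minimal NumPy sketch of Nyström kernel ridge regression, the estimator the abstract refers to. It is an illustrative reconstruction, not the paper's code or the authors' library: the Gaussian kernel, the uniform sampling of inducing centers, the jitter value, and all function names are assumptions. The Nyström estimator solves (Knm^T Knm + n·lam·Kmm) alpha = Knm^T y, where Knm is the kernel matrix between the n training points and the m inducing centers; the regularization lam, the kernel bandwidth sigma, and the number of centers m are exactly the kind of hyperparameters whose tuning the paper addresses.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2 * sigma ** 2))

def nystrom_krr_fit(X, y, centers, lam, kernel):
    """Fit Nyström kernel ridge regression.

    Solves the standard Nyström-KRR normal equations
        (Knm^T Knm + n * lam * Kmm) alpha = Knm^T y,
    where Knm is the (n, m) kernel matrix between the n training
    points and the m inducing centers, and Kmm is the (m, m)
    kernel matrix among the centers.
    """
    n = X.shape[0]
    Knm = kernel(X, centers)
    Kmm = kernel(centers, centers)
    A = Knm.T @ Knm + n * lam * Kmm
    # Small jitter (an arbitrary choice here) for numerical stability.
    A += 1e-10 * np.eye(A.shape[0])
    return np.linalg.solve(A, Knm.T @ y)

# Usage: draw m centers uniformly at random from the data, then fit.
# lam, sigma, and the number of centers are the hyperparameters that
# would be tuned; they are fixed to arbitrary values in this sketch.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=1000)
centers = X[rng.choice(len(X), size=100, replace=False)]
alpha = nystrom_krr_fit(X, y, centers, lam=1e-3, kernel=gaussian_kernel)
predictions = gaussian_kernel(X, centers) @ alpha
```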
