Paper Title

Optimal Weak to Strong Learning

Authors

Kasper Green Larsen, Martin Ritzert

Abstract

The classic algorithm AdaBoost allows one to convert a weak learner, that is, an algorithm that produces a hypothesis which is slightly better than chance, into a strong learner that achieves arbitrarily high accuracy when given enough training data. We present a new algorithm that constructs a strong learner from a weak learner but uses less training data than AdaBoost and all other weak-to-strong learners to achieve the same generalization bounds. A sample complexity lower bound shows that our new algorithm uses the minimum possible amount of training data and is thus optimal. Hence, this work settles the sample complexity of the classic problem of constructing a strong learner from a weak learner.
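To make the weak-to-strong conversion concrete, below is a minimal sketch of the classic AdaBoost procedure the abstract refers to, using decision stumps as the weak learner. This illustrates only standard AdaBoost (Freund–Schapire reweighting), not the paper's new sample-optimal algorithm; the function and variable names are our own.

```python
import numpy as np

def adaboost(X, y, rounds=10):
    """Classic AdaBoost with decision-stump weak learners.
    Labels y must be in {-1, +1}. Returns a list of weighted stumps."""
    n = len(y)
    w = np.full(n, 1.0 / n)            # distribution over training examples
    stumps = []                        # each entry: (feature, threshold, sign, alpha)
    for _ in range(rounds):
        best = None
        # Weak learner: exhaustively pick the stump with lowest weighted error.
        for f in range(X.shape[1]):
            for t in np.unique(X[:, f]):
                for s in (1, -1):
                    pred = s * np.where(X[:, f] <= t, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, f, t, s)
        err, f, t, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)     # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)     # weight of this weak hypothesis
        pred = s * np.where(X[:, f] <= t, 1, -1)
        w *= np.exp(-alpha * y * pred)            # upweight misclassified examples
        w /= w.sum()
        stumps.append((f, t, s, alpha))
    return stumps

def predict(stumps, X):
    """Strong learner: sign of the alpha-weighted vote of all stumps."""
    score = np.zeros(len(X))
    for f, t, s, alpha in stumps:
        score += alpha * s * np.where(X[:, f] <= t, 1, -1)
    return np.sign(score)
```

For example, on the 1-D points 0, 1, 2, 3 with labels -1, +1, +1, -1 (an interval, which no single stump can classify), three boosting rounds already suffice for the weighted vote to fit all four points.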
