Paper Title


Split-kl and PAC-Bayes-split-kl Inequalities for Ternary Random Variables

Paper Authors

Yi-Shan Wu, Yevgeny Seldin

Paper Abstract


We present a new concentration of measure inequality for sums of independent bounded random variables, which we name a split-kl inequality. The inequality is particularly well-suited for ternary random variables, which naturally show up in a variety of problems, including analysis of excess losses in classification, analysis of weighted majority votes, and learning with abstention. We demonstrate that for ternary random variables the inequality is simultaneously competitive with the kl inequality, the Empirical Bernstein inequality, and the Unexpected Bernstein inequality, and in certain regimes outperforms all of them. It resolves an open question by Tolstikhin and Seldin [2013] and Mhammedi et al. [2019] on how to match simultaneously the combinatorial power of the kl inequality when the distribution happens to be close to binary and the power of Bernstein inequalities to exploit low variance when the probability mass is concentrated on the middle value. We also derive a PAC-Bayes-split-kl inequality and compare it with the PAC-Bayes-kl, PAC-Bayes-Empirical-Bennett, and PAC-Bayes-Unexpected-Bernstein inequalities in an analysis of excess losses and in an analysis of a weighted majority vote for several UCI datasets. Last but not least, our study provides the first direct comparison of the Empirical Bernstein and Unexpected Bernstein inequalities and their PAC-Bayes extensions.
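To make the idea behind the split-kl construction concrete, here is a minimal Python sketch (not the authors' code): a ternary variable is decomposed into binary indicator variables, the standard binary kl inequality is inverted for each indicator with a union bound over the segments, and the resulting bounds are recombined. The function names (`split_kl_upper_bound`, `kl_inv_upper`) and the confidence allocation ln(K/δ)/n are illustrative assumptions; the paper's theorem may state the constants differently.

```python
import numpy as np

def kl_binary(p_hat, p):
    """Binary KL divergence kl(p_hat || p) between Bernoulli parameters."""
    eps = 1e-12
    p_hat = min(max(p_hat, eps), 1 - eps)
    p = min(max(p, eps), 1 - eps)
    return p_hat * np.log(p_hat / p) + (1 - p_hat) * np.log((1 - p_hat) / (1 - p))

def kl_inv_upper(p_hat, budget, tol=1e-9):
    """Largest p in [p_hat, 1] with kl(p_hat || p) <= budget, via bisection."""
    lo, hi = p_hat, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if kl_binary(p_hat, mid) <= budget:
            lo = mid
        else:
            hi = mid
    return lo

def split_kl_upper_bound(z, values, delta):
    """
    Sketch of a split-kl style upper bound on E[Z] for Z taking values in the
    sorted finite set `values` (e.g. (-1, 0, 1) for excess losses).
    Decomposes Z into binary indicators 1[Z >= b_j] and applies the binary
    kl bound to each, with a union bound over the K segments.
    """
    z = np.asarray(z, dtype=float)
    n = len(z)
    b = np.asarray(sorted(values), dtype=float)
    K = len(b) - 1                      # number of binary segments
    budget = np.log(K / delta) / n      # union bound over K indicators (illustrative)
    bound = b[0]
    for j in range(1, K + 1):
        alpha_j = b[j] - b[j - 1]       # segment width
        f_hat_j = np.mean(z >= b[j])    # empirical mean of the j-th indicator
        bound += alpha_j * kl_inv_upper(f_hat_j, budget)
    return bound

# Example: ternary excess losses in {-1, 0, 1} with most mass on the middle value.
rng = np.random.default_rng(0)
sample = rng.choice([-1, 0, 1], size=1000, p=[0.1, 0.8, 0.1])
print(split_kl_upper_bound(sample, (-1, 0, 1), delta=0.05))
```

Because each indicator is itself a Bernoulli variable, this recombination keeps the combinatorial tightness of the binary kl bound on each segment, which is the mechanism the abstract refers to when comparing against the Empirical and Unexpected Bernstein inequalities.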
