Paper Title

On the Inconsistency of Kernel Ridgeless Regression in Fixed Dimensions

Paper Authors

Daniel Beaglehole, Mikhail Belkin, Parthe Pandit

Paper Abstract

"Benign overfitting", the ability of certain algorithms to interpolate noisy training data and yet perform well out-of-sample, has been a topic of considerable recent interest. We show, using a fixed design setup, that an important class of predictors, kernel machines with translation-invariant kernels, does not exhibit benign overfitting in fixed dimensions. In particular, the estimated predictor does not converge to the ground truth with increasing sample size, for any non-zero regression function and any (even adaptive) bandwidth selection. To prove these results, we give exact expressions for the generalization error, and its decomposition in terms of an approximation error and an estimation error that elicits a trade-off based on the selection of the kernel bandwidth. Our results apply to commonly used translation-invariant kernels such as Gaussian, Laplace, and Cauchy.
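
For context, the predictor under study is the minimum-norm interpolant, i.e. the ridgeless limit of kernel ridge regression. In the fixed-design model y_i = f*(x_i) + noise, its mean squared error admits the standard split into an approximation (bias) term and an estimation (variance) term; the paper derives exact, kernel-specific versions of this generic decomposition, which is only sketched here:

```latex
\mathbb{E}\,\lVert \hat f - f^{*} \rVert^{2}
  \;=\; \underbrace{\lVert \mathbb{E}\hat f - f^{*} \rVert^{2}}_{\text{approximation error}}
  \;+\; \underbrace{\mathbb{E}\,\lVert \hat f - \mathbb{E}\hat f \rVert^{2}}_{\text{estimation error}},
\qquad
\hat f(\cdot) \;=\; k(\cdot, X)\, K^{-1} y .
```

Below is a minimal runnable sketch of this estimator with a Laplace kernel, one of the translation-invariant kernels covered by the paper. The function names, the toy data, and the bandwidth value are illustrative choices, not taken from the paper:

```python
import numpy as np

def laplace_kernel(X, Z, bandwidth):
    """Laplace kernel k(x, z) = exp(-||x - z|| / bandwidth)."""
    dists = np.sqrt(np.sum((X[:, None, :] - Z[None, :, :]) ** 2, axis=-1))
    return np.exp(-dists / bandwidth)

def fit_ridgeless(X, y, bandwidth):
    """Minimum-norm interpolant: coefficients alpha solving K alpha = y."""
    K = laplace_kernel(X, X, bandwidth)
    return np.linalg.solve(K, y)  # ridge penalty -> 0: fit the noise exactly

def predict(X_train, alpha, X_test, bandwidth):
    return laplace_kernel(X_test, X_train, bandwidth) @ alpha

# Fixed one-dimensional design with additive label noise (toy example).
rng = np.random.default_rng(0)
n = 50
X = np.linspace(0.0, 1.0, n)[:, None]        # fixed design points
f_star = np.sin(2.0 * np.pi * X[:, 0])       # a nonzero regression function
y = f_star + 0.3 * rng.standard_normal(n)    # noisy observations

alpha = fit_ridgeless(X, y, bandwidth=0.05)
y_hat = predict(X, alpha, X, bandwidth=0.05)
print(np.max(np.abs(y_hat - y)))             # ~0: the noise is interpolated
```

The bandwidth controls the trade-off the abstract refers to: a small bandwidth lets the interpolant track the noise tightly, while a large one smooths the fit at the cost of approximating f* poorly. The paper's result is that no bandwidth schedule, even one adapting to the sample size, makes this estimator converge to the ground truth.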
