Paper Title

Kolmogorov Width Decay and Poor Approximators in Machine Learning: Shallow Neural Networks, Random Feature Models and Neural Tangent Kernels

Paper Authors

Weinan E, Stephan Wojtowytsch

Paper Abstract

We establish a scale separation of Kolmogorov width type between subspaces of a given Banach space under the condition that a sequence of linear maps converges much faster on one of the subspaces. The general technique is then applied to show that reproducing kernel Hilbert spaces are poor $L^2$-approximators for the class of two-layer neural networks in high dimension, and that multi-layer networks with small path norm are poor approximators for certain Lipschitz functions, also in the $L^2$-topology.
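
For context, the separation result is phrased in terms of a Kolmogorov-width-type quantity. The classical Kolmogorov $n$-width is a standard definition (included here for orientation, not reproduced from the paper itself): for a set $A$ in a Banach space $X$,

$$ d_n(A; X) = \inf_{\substack{X_n \subset X \\ \dim X_n \le n}} \, \sup_{f \in A} \, \inf_{g \in X_n} \| f - g \|_X, $$

where the outer infimum runs over all linear subspaces $X_n$ of $X$ of dimension at most $n$. Slow decay of $d_n(A; X)$ as $n \to \infty$ formalizes the sense in which one function class is a "poor approximator" for another.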
