Paper Title

Low-degree learning and the metric entropy of polynomials

Authors

Alexandros Eskenazis, Paata Ivanisvili, Lauritz Streck

Abstract

Let $\mathscr{F}_{n,d}$ be the class of all functions $f:\{-1,1\}^n\to[-1,1]$ on the $n$-dimensional discrete hypercube of degree at most $d$. In the first part of this paper, we prove that any (deterministic or randomized) algorithm which learns $\mathscr{F}_{n,d}$ with $L_2$-accuracy $\varepsilon$ requires at least $\Omega((1-\sqrt{\varepsilon})2^d\log n)$ queries for large enough $n$, thus establishing the sharpness as $n\to\infty$ of a recent upper bound of Eskenazis and Ivanisvili (2021). To do this, we show that the $L_2$-packing numbers $\mathsf{M}(\mathscr{F}_{n,d},\|\cdot\|_{L_2},\varepsilon)$ of the concept class $\mathscr{F}_{n,d}$ satisfy the two-sided estimate $$c(1-\varepsilon)2^d\log n \leq \log \mathsf{M}(\mathscr{F}_{n,d},\|\cdot\|_{L_2},\varepsilon) \leq \frac{2^{Cd}\log n}{\varepsilon^4}$$ for large enough $n$, where $c, C>0$ are universal constants. In the second part of the paper, we present a logarithmic upper bound for the randomized query complexity of classes of bounded approximate polynomials whose Fourier spectra are concentrated on few subsets. As an application, we prove new estimates for the number of random queries required to learn approximate juntas of a given degree, functions with rapidly decaying Fourier tails and constant depth circuits of given size. Finally, we obtain bounds for the number of queries required to learn the polynomial class $\mathscr{F}_{n,d}$ without error in the query and random example models.
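For intuition about the kind of learner whose query complexity is being bounded, the following is a minimal Python sketch of the classical low-degree (Linial–Mansour–Nisan-style) Fourier-sampling algorithm: estimate every Fourier coefficient $\hat{f}(S)$ with $|S|\leq d$ from uniformly random examples and output the resulting polynomial. This is an illustrative assumption about the algorithmic setup, not the specific procedure analyzed by Eskenazis and Ivanisvili; the function name, sampling scheme, and parameters are hypothetical.

```python
import itertools

import numpy as np


def low_degree_learn(f, n, d, num_samples, seed=None):
    """Estimate the degree-<= d Fourier expansion of f: {-1,1}^n -> [-1,1]
    from uniformly random examples.

    Illustrative sketch of a Linial-Mansour-Nisan-style low-degree learner;
    it enumerates all monomials of degree <= d, so it is only meant for small n, d.
    """
    rng = np.random.default_rng(seed)

    # Draw uniform random points of the hypercube and query f there.
    X = rng.choice([-1, 1], size=(num_samples, n))
    y = np.array([f(x) for x in X])

    # Empirically estimate hat{f}(S) = E[f(x) * prod_{i in S} x_i] for every |S| <= d.
    coeffs = {}
    for k in range(d + 1):
        for S in itertools.combinations(range(n), k):
            chi_S = X[:, list(S)].prod(axis=1) if S else np.ones(num_samples)
            coeffs[S] = float(np.mean(y * chi_S))

    def approximation(x):
        """Evaluate the learned low-degree polynomial at a point x in {-1,1}^n."""
        x = np.asarray(x)
        return sum(c * x[list(S)].prod() for S, c in coeffs.items())

    return coeffs, approximation


if __name__ == "__main__":
    # Toy target: a degree-2 polynomial in 6 variables, bounded by 1 in absolute value.
    target = lambda x: 0.5 * x[0] * x[1] - 0.25 * x[2]
    coeffs, h = low_degree_learn(target, n=6, d=2, num_samples=20000, seed=0)
    print(coeffs[(0, 1)], coeffs[(2,)])  # should be close to 0.5 and -0.25
```

The paper's results concern how many such random queries are information-theoretically necessary and sufficient: roughly $2^{\Theta(d)}\log n$ examples (up to the $\varepsilon$-dependence shown in the displayed estimate), which is exponentially smaller in $n$ than the $\binom{n}{\leq d}$ coefficients this naive sketch enumerates.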
