Paper Title
Convex Regression in Multidimensions: Suboptimality of Least Squares Estimators
Paper Authors
Abstract
Under the usual nonparametric regression model with Gaussian errors, Least Squares Estimators (LSEs) over natural subclasses of convex functions are shown to be suboptimal for estimating a $d$-dimensional convex function in squared error loss when the dimension $d$ is 5 or larger. The specific function classes considered include: (i) bounded convex functions supported on a polytope (in random design), (ii) Lipschitz convex functions supported on any convex domain (in random design), (iii) convex functions supported on a polytope (in fixed design). For each of these classes, the risk of the LSE is proved to be of the order $n^{-2/d}$ (up to logarithmic factors), while the minimax risk is $n^{-4/(d+4)}$, when $d \ge 5$. In addition, the first rate-of-convergence results (worst case and adaptive) for the unrestricted convex LSE are established in fixed design over polytopal domains for all $d \geq 1$. Some new metric entropy results for convex functions, which are of independent interest, are also proved.
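As a quick sanity check (not from the paper), the exponent algebra behind the $d \ge 5$ threshold can be verified directly: the LSE rate $n^{-2/d}$ is slower than the minimax rate $n^{-4/(d+4)}$ exactly when $2/d < 4/(d+4)$, i.e., when $d > 4$. The function names below are illustrative, not from the paper.

```python
# Compare the risk exponents from the abstract: the convex LSE attains
# risk ~ n^(-2/d), while the minimax risk is ~ n^(-4/(d+4)).
# A smaller exponent means a slower rate, hence suboptimality.

def lse_exponent(d: int) -> float:
    """Exponent in the LSE risk bound n^(-2/d)."""
    return 2 / d

def minimax_exponent(d: int) -> float:
    """Exponent in the minimax risk n^(-4/(d+4))."""
    return 4 / (d + 4)

for d in range(1, 9):
    slower = lse_exponent(d) < minimax_exponent(d)
    print(f"d={d}: LSE exp={lse_exponent(d):.3f}, "
          f"minimax exp={minimax_exponent(d):.3f}, "
          f"{'LSE suboptimal' if slower else 'rates match or LSE faster'}")
```

Note that the two exponents coincide at $d = 4$ ($2/4 = 4/8 = 1/2$), which is why the suboptimality phenomenon described in the abstract begins at $d = 5$.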