Paper Title


Nonparametric regression with modified ReLU networks

Paper Authors

Aleksandr Beknazaryan, Hailin Sang

Paper Abstract


We consider regression estimation with modified ReLU neural networks, in which each network weight matrix is first modified by a function $\alpha$ before being multiplied by the input vector. We give an example of a continuous, piecewise linear function $\alpha$ for which the empirical risk minimizers over the classes of modified ReLU networks with $l_1$ and squared $l_2$ penalties attain, up to a logarithmic factor, the minimax rate of prediction of an unknown $\beta$-smooth function.
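To make the architecture concrete, below is a minimal NumPy sketch of a "modified ReLU" forward pass and its penalized empirical risk. The specific $\alpha$ that attains the minimax rate is constructed in the paper; the soft-thresholding map used here is only an illustrative stand-in, and the function names, `tau`, and `lam` are hypothetical choices for this sketch.

```python
import numpy as np

def relu(x):
    """Standard ReLU activation."""
    return np.maximum(x, 0.0)

def alpha(w, tau=1.0):
    """Illustrative continuous, piecewise linear weight modification
    (soft thresholding). The alpha achieving the minimax rate in the
    paper may differ; this is only a stand-in for the sketch."""
    return np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)

def modified_relu_network(x, weights, biases, tau=1.0):
    """Forward pass of a modified ReLU network: each weight matrix is
    passed elementwise through alpha before multiplying the layer input."""
    h = x
    for l, (W, b) in enumerate(zip(weights, biases)):
        z = alpha(W, tau) @ h + b                     # modified weights times input
        h = z if l == len(weights) - 1 else relu(z)   # no activation at the output layer
    return h

def penalized_empirical_risk(x_batch, y_batch, weights, biases,
                             lam=0.1, penalty="l1", tau=1.0):
    """Squared-error empirical risk plus an l1 or squared l2 penalty
    on the (unmodified) network weights."""
    preds = np.array([modified_relu_network(x, weights, biases, tau)
                      for x in x_batch]).squeeze()
    risk = np.mean((preds - y_batch) ** 2)
    flat = np.concatenate([W.ravel() for W in weights])
    reg = np.sum(np.abs(flat)) if penalty == "l1" else np.sum(flat ** 2)
    return risk + lam * reg
```

In this sketch the penalty is applied to the raw weights while the forward pass uses the modified weights $\alpha(W)$, which mirrors the abstract's description of penalized empirical risk minimization over classes of modified ReLU networks.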
