Paper Title

Rates of convergence for nonparametric estimation of singular distributions using generative adversarial networks

Paper Authors

Jeyong Lee, Hyeok Kyu Kwon, Minwoo Chae

Paper Abstract

It is common in nonparametric estimation problems to impose a certain low-dimensional structure on the unknown parameter to avoid the curse of dimensionality. This paper considers a nonparametric distribution estimation problem with a structural assumption under which the target distribution is allowed to be singular with respect to the Lebesgue measure. In particular, we investigate the use of generative adversarial networks (GANs) for estimating the unknown distribution and obtain a convergence rate with respect to the $L^1$-Wasserstein metric. The convergence rate depends only on the underlying structure and noise level. More interestingly, under the same structural assumption, the convergence rate of GAN is strictly faster than the known rate of VAE in the literature. We also obtain a lower bound for the minimax optimal rate, which is conjectured to be sharp at least in some special cases. Although our upper and lower bounds for the minimax optimal rate do not match, the difference is not significant.
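
As background for the result described above, the $L^1$-Wasserstein ($W_1$) metric and a GAN-type estimator can be written schematically as follows. The notation ($P_0$ for the true distribution, $\mathcal{P}_{\mathrm{gen}}$ for the class of generator-induced distributions, $\mathcal{D}$ for the discriminator class) is introduced here only for illustration; the exact function classes, and the way the noise level enters the model, are specified in the paper itself.

$$
W_1(\mu,\nu)=\sup_{\|f\|_{\mathrm{Lip}}\le 1}\left|\int f\,d\mu-\int f\,d\nu\right|,
\qquad
\widehat{P}_n\in\operatorname*{arg\,min}_{P\in\mathcal{P}_{\mathrm{gen}}}\ \sup_{d\in\mathcal{D}}\left|\int d\,dP-\frac{1}{n}\sum_{i=1}^{n}d(X_i)\right|,
$$

where $X_1,\dots,X_n$ denote the observations. Each $P\in\mathcal{P}_{\mathrm{gen}}$ is the pushforward of a low-dimensional latent distribution through a neural network generator, so $\widehat{P}_n$ may itself be singular with respect to the Lebesgue measure; the rates referred to in the abstract are bounds on $W_1(\widehat{P}_n,P_0)$ that depend only on the underlying structure and the noise level.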
