Paper Title
Asymptotic Statistical Analysis of $f$-divergence GAN
Paper Authors
Paper Abstract
Generative Adversarial Networks (GANs) have achieved great success in data generation. However, their statistical properties are not fully understood. In this paper, we consider the statistical behavior of the general $f$-divergence formulation of GANs, which includes the Kullback--Leibler divergence closely related to the maximum likelihood principle. We show that for correctly specified parametric generative models, all $f$-divergence GANs with the same discriminator class are asymptotically equivalent under suitable regularity conditions. Moreover, with an appropriately chosen local discriminator, they become asymptotically equivalent to the maximum likelihood estimate. For misspecified generative models, GANs with different $f$-divergences converge to different estimators, and thus cannot be directly compared. However, it is shown that for some commonly used $f$-divergences, the original $f$-GAN is not optimal in that one can achieve a smaller asymptotic variance when the discriminator training in the original $f$-GAN formulation is replaced by logistic regression. The resulting estimation method is referred to as Adversarial Gradient Estimation (AGE). Empirical studies are provided to support the theory and to demonstrate the advantage of AGE over the original $f$-GANs under model misspecification.
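For context, a minimal sketch of the $f$-GAN formulation the abstract refers to, in its standard variational form; the notation $T_\phi$ (discriminator), $q_\theta$ (generator density), $p$ (data density), and $f^*$ (convex conjugate of $f$, $f^*(t) = \sup_u \{ut - f(u)\}$) is assumed here and the paper's exact parameterization may differ:
\[
D_f\big(p \,\|\, q_\theta\big) \;=\; \sup_{T}\Big( \mathbb{E}_{x \sim p}\big[T(x)\big] \;-\; \mathbb{E}_{x \sim q_\theta}\big[f^{*}(T(x))\big] \Big),
\]
so the $f$-GAN estimator solves $\min_{\theta} \max_{\phi}\, \mathbb{E}_{x \sim p}[T_\phi(x)] - \mathbb{E}_{x \sim q_\theta}[f^{*}(T_\phi(x))]$ with $T$ restricted to the discriminator class $\{T_\phi\}$. Taking $f(u) = u \log u$ recovers the Kullback--Leibler divergence mentioned above, which links this formulation to maximum likelihood.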