Paper Title

Discriminator-Cooperated Feature Map Distillation for GAN Compression

Paper Authors

Tie Hu, Mingbao Lin, Lizhou You, Fei Chao, Rongrong Ji

Paper Abstract

Despite excellent performance in image generation, Generative Adversarial Networks (GANs) are notorious for their enormous storage requirements and intensive computation. As an awesome ''performance maker'', knowledge distillation has proven particularly efficacious in exploring low-priced GANs. In this paper, we investigate the irreplaceability of the teacher discriminator and present an inventive discriminator-cooperated distillation, abbreviated as DCD, towards refining better feature maps from the generator. In contrast to conventional pixel-to-pixel matching methods in feature map distillation, our DCD utilizes the teacher discriminator as a transformation to drive intermediate results of the student generator to be perceptually close to the corresponding outputs of the teacher generator. Furthermore, to mitigate mode collapse in GAN compression, we construct a collaborative adversarial training paradigm in which the teacher discriminator is established from scratch to co-train with the student generator alongside our DCD. Our DCD shows superior results compared with existing GAN compression methods. For instance, after reducing the MACs of CycleGAN by over 40x and its parameters by over 80x, we decrease the FID metric from 61.53 to 48.24, while the current SoTA method achieves only 51.92. The source code of this work is available at https://github.com/poopit/DCD-official.
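The core idea above, matching generator feature maps *after* they pass through the teacher discriminator rather than pixel-to-pixel, can be sketched as follows. This is a minimal illustration in NumPy, not the authors' implementation: the fixed random projection standing in for the teacher discriminator's layers, and all function names (`discriminator_features`, `dcd_loss`), are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the teacher discriminator's early layers:
# a fixed linear projection followed by ReLU acts as the perceptual
# "transformation" through which features are compared.
W = rng.standard_normal((64, 16))

def discriminator_features(fmap):
    """Project flattened generator feature maps into the discriminator's space."""
    return np.maximum(fmap @ W, 0.0)

def dcd_loss(student_fmap, teacher_fmap):
    """Discriminator-cooperated distillation loss (sketch):
    distance is measured on discriminator-transformed features,
    not pixel-to-pixel on the raw generator feature maps."""
    ds = discriminator_features(student_fmap)
    dt = discriminator_features(teacher_fmap)
    return float(np.mean((ds - dt) ** 2))

# Toy feature maps: a batch of flattened generator activations.
teacher = rng.standard_normal((8, 64))
student = teacher + 0.1 * rng.standard_normal((8, 64))  # close to teacher

print("student vs teacher:", dcd_loss(student, teacher))
print("unrelated vs teacher:", dcd_loss(rng.standard_normal((8, 64)), teacher))
```

In the actual method the transformation is the (trainable) teacher discriminator co-trained with the student generator, so the perceptual space itself adapts during compression.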
