Paper Title

Learning with Privileged Information for Efficient Image Super-Resolution

Authors

Wonkyung Lee, Junghyup Lee, Dohyung Kim, Bumsub Ham

Abstract

Convolutional neural networks (CNNs) have allowed remarkable advances in single image super-resolution (SISR) over the last decade. Most SR methods based on CNNs have focused on achieving performance gains in terms of quality metrics, such as PSNR and SSIM, over classical approaches. They typically require a large amount of memory and computational units. FSRCNN, consisting of a small number of convolutional layers, has shown promising results while using an extremely small number of network parameters. We introduce in this paper a novel distillation framework, consisting of teacher and student networks, that boosts the performance of FSRCNN drastically. To this end, we propose to use ground-truth high-resolution (HR) images as privileged information. The encoder in the teacher learns the degradation process, i.e., subsampling of HR images, using an imitation loss. The student and the decoder in the teacher, having the same network architecture as FSRCNN, try to reconstruct HR images. Intermediate features in the decoder, which are affordable for the student to learn, are transferred to the student through feature distillation. Experimental results on standard benchmarks demonstrate the effectiveness and the generalization ability of our framework, which significantly boosts the performance of FSRCNN as well as other SR methods. Our code and model are available online: https://cvlab.yonsei.ac.kr/projects/PISR.
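The abstract outlines the core training scheme: a teacher encoder that learns the HR-to-LR degradation with an imitation loss, a teacher decoder and a student that share the FSRCNN architecture, and feature distillation from the teacher decoder's intermediate features to the student. The following is a minimal PyTorch sketch of these losses only; all module names, layer sizes, and loss choices are hypothetical illustrations, not the authors' implementation (which is available at the project page above).

```python
# A minimal sketch (not the authors' code) of the distillation losses described
# in the abstract. All names and hyperparameters here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySRNet(nn.Module):
    """Stand-in for the FSRCNN-style network shared by the teacher decoder
    and the student: a few conv layers plus pixel-shuffle upsampling."""
    def __init__(self, channels=16, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.tail = nn.Sequential(
            nn.Conv2d(channels, 3 * scale ** 2, 3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, x):
        feat = self.body(x)  # intermediate features used for distillation
        return self.tail(feat), feat

class TeacherEncoder(nn.Module):
    """Teacher encoder: maps the ground-truth HR image (privileged
    information) to an LR-sized output, learning the degradation process."""
    def __init__(self, scale=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 3, 3, stride=scale, padding=1),
        )

    def forward(self, hr):
        return self.net(hr)

scale = 2
teacher_enc = TeacherEncoder(scale)
teacher_dec = TinySRNet(scale=scale)
student = TinySRNet(scale=scale)

hr = torch.rand(4, 3, 64, 64)  # ground-truth HR patches
lr = F.interpolate(hr, scale_factor=1 / scale, mode='bicubic',
                   align_corners=False)

# Teacher: the encoder imitates the subsampled LR image (imitation loss),
# and the decoder reconstructs the HR image from the encoder output.
lr_hat = teacher_enc(hr)
t_sr, t_feat = teacher_dec(lr_hat)
loss_teacher = F.l1_loss(lr_hat, lr) + F.l1_loss(t_sr, hr)

# Student: same architecture, trained with a reconstruction loss plus
# feature distillation against the teacher decoder's intermediate features.
# In practice the teacher is trained first, then frozen for this stage.
s_sr, s_feat = student(lr)
loss_student = F.l1_loss(s_sr, hr) + F.l1_loss(s_feat, t_feat.detach())
```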
