Paper Title

Regularized Adaptation for Stable and Efficient Continuous-Level Learning on Image Processing Networks

Paper Authors

Hyeongmin Lee, Taeoh Kim, Hanbin Son, Sangwook Baek, Minsu Cheon, Sangyoun Lee

Paper Abstract

In Convolutional Neural Network (CNN) based image processing, most studies propose networks optimized for a single level (or a single objective); thus, they underperform at other levels and must be retrained to deliver optimal performance. Using multiple models to cover multiple levels involves very high computational costs. To solve these problems, recent approaches train a network at two different levels and propose their own interpolation methods to enable arbitrary intermediate levels. However, many of them fail to adapt to hard tasks or to interpolate smoothly, while the others still require large memory and computational cost. In this paper, we propose a novel continuous-level learning framework using a Filter Transition Network (FTN), a non-linear module that easily adapts to new levels and is regularized to prevent undesirable side effects. Additionally, for stable learning of the FTN, we newly propose a method to initialize non-linear CNNs as identity mappings. Furthermore, the FTN is an extremely lightweight module because it is data-independent, meaning it is not affected by the spatial resolution of the inputs. Extensive results on various image processing tasks indicate that the performance of the FTN is stable in terms of adaptation and interpolation, and comparable to that of other, heavier frameworks.
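The abstract describes the FTN as a small non-linear module that transforms a layer's convolution filters (rather than its feature maps) toward a second level, is initialized as an identity mapping, and supports interpolation for arbitrary intermediate levels. The PyTorch sketch below illustrates one plausible shape for such a filter-level adaptation; the `FilterTransition` class, its layer layout, and the `alpha`-interpolation convention are illustrative assumptions for this sketch, not the authors' implementation.

```python
# Minimal sketch of a filter-transition-style module, based only on the abstract:
# a small non-linear network adapts a conv layer's filters to a second level,
# starts out as an identity mapping, and its cost does not depend on input
# resolution because it never touches feature maps. All details are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FilterTransition(nn.Module):
    """Adapts a base filter bank toward a second level; identity at initialization."""

    def __init__(self, in_ch):
        super().__init__()
        # 1x1 convs applied to the filter tensor viewed as an (out_ch, in_ch, k, k) batch.
        self.transform = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_ch, in_ch, kernel_size=1),
        )
        # Zero-init the last layer so the residual branch outputs zero at the start,
        # making the whole module an exact identity mapping before training.
        nn.init.zeros_(self.transform[-1].weight)
        nn.init.zeros_(self.transform[-1].bias)

    def forward(self, base_filters, alpha):
        # base_filters: (out_ch, in_ch, k, k) weights trained for the first level.
        adapted = base_filters + self.transform(base_filters)  # filters for the second level
        # Interpolate in filter space for an arbitrary intermediate level alpha in [0, 1].
        return (1.0 - alpha) * base_filters + alpha * adapted


# Usage: adapt the filters of one conv layer and run it at an intermediate level.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
ftn = FilterTransition(in_ch=3)
x = torch.randn(1, 3, 64, 64)
w = ftn(conv.weight, alpha=0.5)
y = F.conv2d(x, w, bias=conv.bias, padding=1)
```

Zero-initializing the last layer of the residual branch is one simple way to realize the "identity initialization of a non-linear CNN" that the abstract mentions: training can then move away from the first-level filters gradually, which is what makes the adaptation and the subsequent interpolation stable.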
