Title
MetaDIP: Accelerating Deep Image Prior with Meta Learning
Authors
Abstract
Deep image prior (DIP) is a recently proposed technique for solving imaging inverse problems by fitting the reconstructed images to the output of an untrained convolutional neural network. Unlike pretrained feedforward neural networks, the same DIP can generalize to arbitrary inverse problems, from denoising to phase retrieval, while offering competitive performance at each task. The central disadvantage of DIP is that, while feedforward neural networks can reconstruct an image in a single pass, DIP must gradually update its weights over hundreds to thousands of iterations, at a significant computational cost. In this work we use meta-learning to massively accelerate DIP-based reconstructions. By learning a proper initialization for the DIP weights, we demonstrate a 10x improvement in runtimes across a range of inverse imaging tasks. Moreover, we demonstrate that a network trained to quickly reconstruct faces also generalizes to reconstructing natural image patches.
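The two ideas in the abstract, iteratively fitting an untrained network's output to the measurements (DIP), and meta-learning an initialization so that fitting converges in far fewer iterations, can be sketched in miniature. The snippet below is a toy illustration, not the paper's method: it uses a tiny two-layer fully connected network on 1-D signals instead of a convolutional network on images, and a first-order Reptile-style meta-update as a simple stand-in for the meta-learning procedure. All names, sizes, and hyperparameters are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "deep image prior": a fixed random input Z is mapped through a small
# two-layer network f(Z; W1, W2), and the weights are updated by gradient
# descent so the output matches the noisy measurement y.
D_IN, D_HID, D_OUT = 16, 32, 64
Z = rng.normal(size=(1, D_IN))          # fixed random network input

def forward(W1, W2):
    H = np.tanh(Z @ W1)
    return H @ W2, H

def loss(y, W1, W2):
    out, _ = forward(W1, W2)
    return float(np.mean((out - y) ** 2))

def dip_fit(y, W1, W2, steps, lr=0.05):
    """Run `steps` gradient-descent iterations of DIP-style fitting."""
    for _ in range(steps):
        out, H = forward(W1, W2)
        G = 2.0 * (out - y) / y.size            # d(MSE)/d(out)
        dH = (G @ W2.T) * (1.0 - H ** 2)        # backprop through tanh
        W2 = W2 - lr * (H.T @ G)
        W1 = W1 - lr * (Z.T @ dH)
    return W1, W2

def make_task():
    """A random smooth 1-D 'image' observed with additive noise."""
    t = np.linspace(0.0, 2.0 * np.pi, D_OUT)
    clean = np.sin(rng.uniform(1.0, 3.0) * t + rng.uniform(0.0, np.pi))
    return clean[None, :] + 0.1 * rng.normal(size=(1, D_OUT))

# Reptile-style meta-learning of the initialization (a simplified stand-in
# for the paper's meta-learning): adapt to a task, then nudge the shared
# init toward the adapted weights.
W1_meta = rng.normal(scale=0.3, size=(D_IN, D_HID))
W2_meta = rng.normal(scale=0.3, size=(D_HID, D_OUT))
for _ in range(200):
    y = make_task()
    W1_a, W2_a = dip_fit(y, W1_meta, W2_meta, steps=20)
    W1_meta = W1_meta + 0.1 * (W1_a - W1_meta)
    W2_meta = W2_meta + 0.1 * (W2_a - W2_meta)

# On a held-out task, a short run from the meta-learned init should
# already drive the loss down -- the source of the reported speedup.
y_test = make_task()
before = loss(y_test, W1_meta, W2_meta)
W1_f, W2_f = dip_fit(y_test, W1_meta, W2_meta, steps=20)
after = loss(y_test, W1_f, W2_f)
print(f"loss before adaptation: {before:.4f}, after 20 steps: {after:.4f}")
```

The design point the abstract makes is visible here: the per-task work is unchanged (the same gradient-descent fitting loop), and only the starting weights differ, so the meta-learned initialization trades a one-time offline training cost for many fewer online iterations per reconstruction.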