Paper Title


Compressive sensing with un-trained neural networks: Gradient descent finds the smoothest approximation

Authors

Reinhard Heckel, Mahdi Soltanolkotabi

Abstract


Un-trained convolutional neural networks have emerged as highly successful tools for image recovery and restoration. They are capable of solving standard inverse problems such as denoising and compressive sensing with excellent results by simply fitting a neural network model to measurements from a single image or signal without the need for any additional training data. For some applications, this critically requires additional regularization in the form of early stopping the optimization. For signal recovery from a few measurements, however, un-trained convolutional networks have an intriguing self-regularizing property: Even though the network can perfectly fit any image, the network recovers a natural image from few measurements when trained with gradient descent until convergence. In this paper, we provide numerical evidence for this property and study it theoretically. We show that---without any further regularization---an un-trained convolutional neural network can approximately reconstruct signals and images that are sufficiently structured, from a near minimal number of random measurements.
