Paper Title

Non-Parametric Style Transfer

Authors

Lee, Jeong-Sik; Choi, Hyun-Chul

Abstract


Recent feed-forward neural methods for arbitrary image style transfer mainly utilize the encoded feature map up to its second-order statistics, i.e., they linearly transform the encoded feature map of a content image to have the same mean and variance (or covariance) as a target style feature map. In this work, we extend second-order statistical feature matching to general distribution matching, based on the understanding that the style of an image is represented by the distribution of responses from receptive fields. For this generalization, first, we propose a new feature transform layer that exactly matches the feature map distribution of the content image to that of the target style image. Second, we analyze recent style losses that are consistent with our new feature transform layer, and use them to train a decoder network that generates a style-transferred image from the transformed feature map. Our experimental results show that the stylized images obtained with our method are more similar to the target style images under all existing style measures, without losing content clarity.
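The contrast drawn in the abstract can be illustrated with a minimal sketch: second-order matching (as in AdaIN-style transforms) aligns only the per-channel mean and variance, whereas full distribution matching replaces each content activation with the style value of the same rank (1-D optimal transport per channel). The functions below are an illustrative NumPy sketch under these assumptions, not the paper's actual feature transform layer; names like `adain_match` and `exact_distribution_match` are hypothetical.

```python
import numpy as np

def adain_match(content, style, eps=1e-5):
    """Second-order matching: shift/scale each content channel to have
    the style channel's mean and standard deviation.
    content, style: arrays of shape (C, H, W)."""
    c = content.reshape(content.shape[0], -1)
    s = style.reshape(style.shape[0], -1)
    c_mean, c_std = c.mean(1, keepdims=True), c.std(1, keepdims=True) + eps
    s_mean, s_std = s.mean(1, keepdims=True), s.std(1, keepdims=True)
    out = (c - c_mean) / c_std * s_std + s_mean
    return out.reshape(content.shape)

def exact_distribution_match(content, style):
    """Per-channel exact distribution matching via rank mapping:
    the k-th smallest content activation is replaced by the style value
    at the corresponding quantile. Handles differing spatial sizes.
    Illustrative sketch only (assumes H*W > 1 per channel)."""
    out = np.empty_like(content)
    for ch in range(content.shape[0]):
        c = content[ch].ravel()
        s = np.sort(style[ch].ravel())
        ranks = np.argsort(np.argsort(c))          # rank of each activation
        idx = np.round(ranks * (s.size - 1) / (c.size - 1)).astype(int)
        out[ch] = s[idx].reshape(content[ch].shape)
    return out
```

After `exact_distribution_match`, each output channel has exactly the style channel's value distribution (when sizes match, the sorted values coincide), while spatial structure, i.e., the ranking of activations, is inherited from the content.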
