Paper Title

Guide Local Feature Matching by Overlap Estimation

Paper Authors

Ying Chen, Dihe Huang, Shang Xu, Jianlin Liu, Yong Liu

Paper Abstract

Local image feature matching under large appearance, viewpoint, and distance changes is challenging yet important. Conventional methods detect and match tentative local features across the whole images, with heuristic consistency checks to guarantee reliable matches. In this paper, we introduce a novel Overlap Estimation method conditioned on image pairs with TRansformer, named OETR, to constrain local feature matching in the commonly visible region. OETR performs overlap estimation in a two-step process of feature correlation and then overlap regression. As a preprocessing module, OETR can be plugged into any existing local feature detection and matching pipeline, to mitigate potential view angle or scale variance. Intensive experiments show that OETR can boost state-of-the-art local feature matching performance substantially, especially for image pairs with small shared regions. The code will be publicly available at https://github.com/AbyssGaze/OETR.
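
The abstract describes OETR as a preprocessing module that constrains matching to the commonly visible region. The minimal sketch below illustrates one way such a module could be wired in front of an existing detection-and-matching pipeline: estimate an overlap box in each image, crop to those boxes, match on the crops, then map keypoints back to full-image coordinates. `estimate_overlap`, `detect_and_match`, and `match_with_overlap_prior` are hypothetical stand-ins, not the authors' implementation or the OETR repository API.

```python
# Sketch of overlap-guided preprocessing for local feature matching.
# The two helper stubs are placeholders: a real system would run the
# OETR network and an off-the-shelf detector/matcher here.
import numpy as np


def estimate_overlap(img0: np.ndarray, img1: np.ndarray):
    """Return an (x0, y0, x1, y1) overlap box for each image.

    Placeholder: returns the full image extents instead of a learned estimate.
    """
    h0, w0 = img0.shape[:2]
    h1, w1 = img1.shape[:2]
    return (0, 0, w0, h0), (0, 0, w1, h1)


def detect_and_match(crop0: np.ndarray, crop1: np.ndarray):
    """Placeholder for any local feature pipeline (detector + matcher).

    Returns two (N, 2) arrays of matched keypoints in crop coordinates.
    """
    return np.empty((0, 2)), np.empty((0, 2))


def match_with_overlap_prior(img0: np.ndarray, img1: np.ndarray):
    box0, box1 = estimate_overlap(img0, img1)

    # Restrict detection and matching to the estimated co-visible regions,
    # which mitigates scale and viewpoint differences between the crops.
    crop0 = img0[box0[1]:box0[3], box0[0]:box0[2]]
    crop1 = img1[box1[1]:box1[3], box1[0]:box1[2]]

    kpts0, kpts1 = detect_and_match(crop0, crop1)

    # Map keypoints from crop coordinates back to full-image coordinates.
    kpts0 = kpts0 + np.array([box0[0], box0[1]])
    kpts1 = kpts1 + np.array([box1[0], box1[1]])
    return kpts0, kpts1
```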
