Paper Title

NOPE-SAC: Neural One-Plane RANSAC for Sparse-View Planar 3D Reconstruction

Paper Authors

Bin Tan, Nan Xue, Tianfu Wu, Gui-Song Xia

Paper Abstract

This paper studies the challenging two-view 3D reconstruction problem in a rigorous sparse-view configuration, which suffers from insufficient correspondences in the input image pairs for camera pose estimation. We present a novel Neural One-PlanE RANSAC framework (termed NOPE-SAC in short) that exerts excellent capability to learn one-plane pose hypotheses from 3D plane correspondences. Building on top of a siamese plane detection network, our NOPE-SAC first generates putative plane correspondences with a coarse initial pose. It then feeds the learned 3D plane parameters of the correspondences into shared MLPs to estimate one-plane camera pose hypotheses, which are subsequently reweighed in a RANSAC manner to obtain the final camera pose. Because the neural one-plane pose minimizes the number of plane correspondences required for adaptive pose hypothesis generation, it enables stable pose voting and reliable pose refinement with only a few plane correspondences for sparse-view inputs. In the experiments, we demonstrate that our NOPE-SAC significantly improves camera pose estimation for two-view inputs with severe viewpoint changes, setting several new state-of-the-art performances on two challenging benchmarks, i.e., MatterPort3D and ScanNet, for sparse-view 3D reconstruction. The source code is released at https://github.com/IceTTTb/NopeSAC for reproducible research.
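
For orientation, below is a minimal, hypothetical PyTorch sketch of the pipeline stages named in the abstract: 3D plane parameters of each putative correspondence are fed through a shared MLP to produce one pose hypothesis per correspondence, and the hypotheses are then fused by RANSAC-style voting. All class names, tensor shapes, and the toy scoring rule are illustrative assumptions, not the authors' implementation; see https://github.com/IceTTTb/NopeSAC for the official code.

# Minimal, hypothetical sketch of the NOPE-SAC pipeline described in the
# abstract. Names, shapes, and the toy scoring rule are assumptions for
# illustration only (official code: https://github.com/IceTTTb/NopeSAC).
import torch
import torch.nn as nn


class OnePlanePoseMLP(nn.Module):
    """Shared MLP mapping one 3D plane correspondence to a pose hypothesis."""

    def __init__(self, feat_dim: int = 8, hidden: int = 128):
        super().__init__()
        # Input: concatenated plane parameters (n1, d1, n2, d2) -> 8 dims.
        # Output: a pose hypothesis as 3D axis-angle rotation + 3D translation.
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 6),
        )

    def forward(self, plane_pairs: torch.Tensor) -> torch.Tensor:
        # plane_pairs: (K, 8), one row per putative plane correspondence.
        return self.mlp(plane_pairs)  # (K, 6) one-plane pose hypotheses


def ransac_style_pose_voting(hypotheses: torch.Tensor) -> torch.Tensor:
    """Score each one-plane pose hypothesis against the others and return a
    softly reweighed (weighted-average) pose. The pairwise L2 residual is a
    toy surrogate for the geometric plane-alignment error."""
    residuals = torch.cdist(hypotheses, hypotheses)         # (K, K) toy residuals
    scores = torch.exp(-residuals).sum(dim=1)               # inlier-style votes
    weights = torch.softmax(scores, dim=0)                  # soft RANSAC weights
    return (weights.unsqueeze(1) * hypotheses).sum(dim=0)   # (6,) fused pose


if __name__ == "__main__":
    K = 5  # only a few plane correspondences, as in the sparse-view setting
    plane_pairs = torch.randn(K, 8)      # stand-in for detected plane parameters
    pose_head = OnePlanePoseMLP()
    hypotheses = pose_head(plane_pairs)  # one pose hypothesis per correspondence
    final_pose = ransac_style_pose_voting(hypotheses)
    print("fused pose (axis-angle + translation):", final_pose)

In the actual method, the coarse initial pose and the putative correspondences come from a siamese plane detection network and the hypothesis scoring is learned; this sketch only illustrates the one-hypothesis-per-plane-correspondence structure that makes pose voting stable with very few correspondences.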
