Paper Title


Deep Point-to-Plane Registration by Efficient Backpropagation for Error Minimizing Function

Authors

Tatsuya Yatagawa, Yutaka Ohtake, Hiromasa Suzuki

Abstract


Traditional algorithms of point set registration minimizing point-to-plane distances often achieve a better estimation of rigid transformation than those minimizing point-to-point distances. Nevertheless, recent deep-learning-based methods minimize the point-to-point distances. In contrast to these methods, this paper proposes the first deep-learning-based approach to point-to-plane registration. A challenging part of this problem is that a typical solution for point-to-plane registration requires an iterative process of accumulating small transformations obtained by minimizing a linearized energy function. The iteration significantly increases the size of the computation graph needed for backpropagation and can slow down both forward and backward network evaluations. To solve this problem, we consider the estimated rigid transformation as a function of input point clouds and derive its analytic gradients using the implicit function theorem. The analytic gradient that we introduce is independent of how the error minimizing function (i.e., the rigid transformation) is obtained, thus allowing us to calculate both the rigid transformation and its gradient efficiently. We implement the proposed point-to-plane registration module over several previous methods that minimize point-to-point distances and demonstrate that the extensions outperform the base methods even with point clouds with noise and low-quality point normals estimated with local point distributions.
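The two ideas in the abstract can be made concrete with a minimal NumPy sketch (the function names here are illustrative, not from the paper): `point_to_plane_step` performs one linearized point-to-plane solve for a small rotation and translation, and `implicit_grad` differentiates a least-squares minimizer through its normal equations via the implicit function theorem, i.e., without unrolling or backpropagating through the iterative solver.

```python
import numpy as np

def point_to_plane_step(src, dst, normals):
    """One linearized point-to-plane step: find (omega, t) minimizing
    sum_i ((p_i + omega x p_i + t - q_i) . n_i)^2 for correspondences
    (p_i, q_i) with target normals n_i, then lift omega to a rotation."""
    # Jacobian rows [p_i x n_i, n_i] and residuals (p_i - q_i) . n_i,
    # using the triple-product identity (omega x p) . n = omega . (p x n).
    J = np.hstack([np.cross(src, normals), normals])   # (N, 6)
    r = np.einsum('ij,ij->i', src - dst, normals)      # (N,)
    x, *_ = np.linalg.lstsq(J, -r, rcond=None)
    omega, t = x[:3], x[3:]
    # Small-angle rotation via Rodrigues' formula.
    theta = np.linalg.norm(omega)
    if theta < 1e-12:
        return np.eye(3), t
    k = omega / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    return R, t

def implicit_grad(J, r, dJ, dr):
    """Directional derivative of x* = argmin_x ||J x + r||^2.
    The stationarity condition F(x, theta) = J^T (J x + r) = 0 defines
    x implicitly; differentiating it (implicit function theorem) gives
    (J^T J) dx = -(dJ^T (J x + r) + J^T (dJ x + dr)),
    independent of how the minimizer x was actually computed."""
    H = J.T @ J
    x = np.linalg.solve(H, -J.T @ r)
    dx = np.linalg.solve(H, -(dJ.T @ (J @ x + r) + J.T @ (dJ @ x + dr)))
    return x, dx
```

This mirrors the paper's efficiency argument in miniature: the forward pass may iterate as many linearized solves as it likes, while the backward pass only needs one extra linear solve against the same 6x6 system, so the computation graph never grows with the iteration count.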
