Paper Title
Progressive Batching for Efficient Non-linear Least Squares
Paper Authors
Paper Abstract
Non-linear least squares solvers are used across a broad range of offline and real-time model fitting problems. Most improvements of the basic Gauss-Newton algorithm tackle convergence guarantees or leverage the sparsity of the underlying problem structure for computational speedup. With the success of deep learning methods leveraging large datasets, stochastic optimization methods have recently received a lot of attention. Our work borrows ideas from both stochastic machine learning and statistics, and we present an approach for non-linear least squares that guarantees convergence while significantly reducing the required amount of computation. Empirical results show that our proposed method achieves competitive convergence rates compared to traditional second-order approaches on common computer vision problems, such as image alignment and essential matrix estimation, with very large numbers of residuals.
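The abstract only sketches the idea at a high level. As a rough illustration of what a progressively batched Gauss-Newton solver can look like, the Python/NumPy sketch below fits a toy exponential model while building each Gauss-Newton step from a randomly sampled subset of residuals that grows over the iterations. The toy model, the geometric batch-growth schedule, and the damping constant are illustrative assumptions for this sketch and are not the authors' actual algorithm, which additionally provides convergence guarantees.

```python
import numpy as np

# Toy model: y ≈ exp(a * x) + b with parameters p = (a, b).
def residuals_and_jacobian(p, x, y):
    a, b = p
    e = np.exp(a * x)
    r = e + b - y                                    # one residual per data point
    J = np.stack([x * e, np.ones_like(x)], axis=1)   # d r / d (a, b)
    return r, J

def progressive_gauss_newton(x, y, p0, batch0=64, growth=1.5, iters=30, seed=0):
    """Gauss-Newton where each step uses a growing random subset of residuals.

    The geometric growth schedule is an illustrative assumption, not the
    paper's batching rule.
    """
    rng = np.random.default_rng(seed)
    p = np.asarray(p0, dtype=float)
    n, batch = len(x), float(batch0)
    for _ in range(iters):
        idx = rng.choice(n, size=min(int(batch), n), replace=False)
        r, J = residuals_and_jacobian(p, x[idx], y[idx])
        # Damped normal equations of the sampled sub-problem.
        H = J.T @ J + 1e-8 * np.eye(len(p))
        p -= np.linalg.solve(H, J.T @ r)
        batch *= growth
    return p

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 2.0, 100_000)               # many residuals, as in the paper's setting
    y = np.exp(0.7 * x) + 0.3 + 0.01 * rng.standard_normal(x.size)
    print(progressive_gauss_newton(x, y, p0=(0.5, 0.0)))   # expect roughly (0.7, 0.3)
```

Because early iterations touch only a small fraction of the residuals, most of the computation is spent once the iterate is already close to a solution, which is the intuition behind the computational savings claimed in the abstract.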