Paper Title
DroTrack: High-speed Drone-based Object Tracking Under Uncertainty
Paper Authors
Paper Abstract
We present DroTrack, a high-speed visual single-object tracking framework for drone-captured video sequences. Most existing object tracking methods are designed to tackle well-known challenges, such as occlusion and cluttered backgrounds. The complex motion of drones, i.e., multiple degrees of freedom in three-dimensional space, causes high uncertainty. This uncertainty leads to inaccurate location predictions and fuzziness in scale estimations. DroTrack addresses these issues by discovering the dependency between object representation and motion geometry. We implement an effective object segmentation based on Fuzzy C-Means (FCM). We incorporate spatial information into the membership function to cluster the most discriminative segments. We then enhance the object segmentation by using a pre-trained Convolutional Neural Network (CNN) model. DroTrack also leverages geometrical angular motion to estimate a reliable object scale. We discuss the experimental results and performance evaluation using two datasets totalling 51,462 drone-captured frames. The combination of FCM segmentation and angular scaling increased DroTrack's precision by up to $9\%$ and decreased the centre location error by $162$ pixels on average. DroTrack outperforms all the high-speed trackers and achieves results comparable to deep-learning trackers. DroTrack offers high frame rates of up to 1,000 frames per second (fps) with better location precision than a set of state-of-the-art real-time trackers.
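To illustrate the kind of spatially-weighted FCM segmentation the abstract refers to, the sketch below clusters pixel intensities and smooths the memberships over a local window. This is a minimal, assumed formulation for illustration only (the function name `spatial_fcm`, the 5x5 averaging window, and the intensity-only features are my assumptions), not DroTrack's actual implementation or parameters.

```python
# Minimal sketch (not the authors' code) of spatially-weighted Fuzzy C-Means
# segmentation on pixel intensities. The neighbourhood-averaging scheme is an
# illustrative assumption, not DroTrack's exact membership function.
import numpy as np
from scipy.ndimage import uniform_filter

def spatial_fcm(image, n_clusters=2, m=2.0, n_iters=20, seed=0):
    """Cluster pixel intensities; a spatial term smooths memberships locally."""
    h, w = image.shape
    x = image.reshape(-1, 1).astype(float)             # (N, 1) pixel features
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, x.shape[0]))
    u /= u.sum(axis=0, keepdims=True)                   # memberships sum to 1 per pixel

    for _ in range(n_iters):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1, keepdims=True)   # (C, 1) cluster centres
        d = np.abs(x.T - centers) + 1e-9                      # (C, N) distances
        u = 1.0 / (d ** (2.0 / (m - 1.0)))                    # standard FCM update
        u /= u.sum(axis=0, keepdims=True)

        # Spatial weighting: average each cluster's membership map over a
        # local window so isolated noisy pixels follow their neighbourhood.
        spatial = np.stack([uniform_filter(ui.reshape(h, w), size=5).ravel()
                            for ui in u])
        u = u * spatial
        u /= u.sum(axis=0, keepdims=True)

    return u.argmax(axis=0).reshape(h, w)               # hard segment labels

# Example: segment a synthetic noisy frame into object vs. background.
frame = np.zeros((64, 64)); frame[20:40, 20:40] = 1.0
frame += np.random.default_rng(1).normal(0, 0.1, frame.shape)
labels = spatial_fcm(frame, n_clusters=2)
```

In the full framework, such a segment map would be refined with pre-trained CNN features and combined with the angular-motion scale estimation described above; those steps are outside the scope of this sketch.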