Paper Title

Traffic Prediction Framework for OpenStreetMap using Deep Learning based Complex Event Processing and Open Traffic Cameras

Paper Authors

Piyush Yadav, Dipto Sarkar, Dhaval Salwala, Edward Curry

Paper Abstract

Displaying near-real-time traffic information is a useful feature of digital navigation maps. However, most commercial providers rely on privacy-compromising measures, such as deriving location information from cellphones, to estimate traffic. The lack of an open-source traffic estimation method using open data platforms is a bottleneck for building sophisticated navigation services on top of OpenStreetMap (OSM). We propose a deep learning-based Complex Event Processing (CEP) method that relies on publicly available video camera streams for traffic estimation. The proposed framework performs near-real-time object detection and object property extraction across camera clusters in parallel to derive multiple traffic-related measures, with the results visualized on OpenStreetMap. The estimation of object properties (e.g., vehicle speed, count, direction) provides multidimensional data that can be leveraged to create metrics and visualizations of congestion beyond commonly used density-based measures. Our approach couples both flow and count measures during interpolation by considering each vehicle as a sample point and its speed as the weight. We demonstrate multidimensional traffic metrics (e.g., flow rate, congestion estimation) over OSM by processing 22 traffic cameras from London streets. The system achieves a near-real-time performance of 1.42 seconds median latency and an average F-score of 0.80.
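To make the interpolation idea in the abstract concrete, below is a minimal, hypothetical sketch of a speed-weighted congestion surface: each detected vehicle is treated as a sample point and its speed acts as a weight, so the surface reflects both count (density) and flow (speed). The function name `congestion_surface`, the inverse-distance kernel, the 1/speed weighting, and the data layout are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the speed-weighted interpolation described in the abstract:
# each detected vehicle is a sample point, its speed is the weight. The weighting
# scheme and grid resolution here are illustrative assumptions.
import numpy as np

def congestion_surface(vehicles, grid_lon, grid_lat, power=2.0, eps=1e-6):
    """Interpolate a congestion indicator over a lon/lat grid.

    vehicles: array of shape (N, 3) with columns (lon, lat, speed_kmh).
    Returns an array of shape (len(grid_lat), len(grid_lon)); higher values
    indicate heavier congestion (many slow vehicles nearby).
    """
    lon, lat, speed = vehicles[:, 0], vehicles[:, 1], vehicles[:, 2]
    # Slow-moving vehicles are treated as stronger evidence of congestion.
    weight = 1.0 / (speed + eps)

    gx, gy = np.meshgrid(grid_lon, grid_lat)           # grid cell centres
    # Squared distance from every grid cell to every vehicle sample point.
    d2 = (gx[..., None] - lon) ** 2 + (gy[..., None] - lat) ** 2
    influence = 1.0 / (d2 + eps) ** (power / 2.0)      # inverse-distance kernel

    # Count enters through the number of sample points,
    # flow enters through the per-vehicle speed weight.
    return (influence * weight).sum(axis=-1)

# Example with three made-up detections near a London junction.
vehicles = np.array([
    [-0.1276, 51.5072, 12.0],   # slow vehicle -> strong congestion signal
    [-0.1280, 51.5075, 45.0],
    [-0.1270, 51.5070, 30.0],
])
grid_lon = np.linspace(-0.129, -0.126, 50)
grid_lat = np.linspace(51.506, 51.508, 50)
surface = congestion_surface(vehicles, grid_lon, grid_lat)
print(surface.shape)  # (50, 50) grid, ready to render as an OSM overlay
```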
