Longitudinal-Scanline-Based Arterial Traffic Video Analytics with Coordinate Transformation Assisted by 3D Infrastructure Data

Terry Tianya Zhang, Mengyang Guo, Peter J. Jin, Yi Ge, Jie Gong

Research output: Contribution to journal › Article › peer-review


Abstract

High-resolution vehicle trajectory data can be used to generate a wide range of performance measures and facilitate many smart mobility applications for traffic operations and management. In this paper, a Longitudinal Scanline LiDAR-Camera model is explored for trajectory extraction at urban arterial intersections. The proposed model can efficiently detect vehicle trajectories under the complex, noisy conditions (e.g., hanging cables, lane markings, crossing traffic) typical of an arterial intersection environment. Traces within video footage are then converted into trajectories in world coordinates by matching a video image with a 3D LiDAR (Light Detection and Ranging) model through key infrastructure points. Using 3D LiDAR data significantly improves the camera calibration process for real-world trajectory extraction. The pan-tilt-zoom effects of the traffic camera are handled automatically by a proposed motion estimation algorithm. The results demonstrate the potential of integrating longitudinal-scanline-based vehicle trajectory detection with the 3D LiDAR point cloud to provide lane-by-lane high-resolution trajectory data. The resulting system has the potential to become a low-cost yet reliable data source for future smart mobility systems.
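To illustrate the coordinate-transformation step described above, the following is a minimal sketch (not the authors' implementation) of mapping pixel traces from a longitudinal scanline into world coordinates. It assumes a planar ground surface and uses OpenCV's homography estimation from key infrastructure points visible both in the video frame and in a georeferenced 3D LiDAR model; all point values are hypothetical placeholders, and the paper's handling of pan-tilt-zoom motion is omitted.

```python
# Hedged sketch: ground-plane homography from matched infrastructure points,
# then projection of a scanline pixel trace into world coordinates.
import numpy as np
import cv2

# Pixel locations of key infrastructure points (e.g., curb corners, pole
# bases) picked from a video frame -- hypothetical values.
image_pts = np.array([[412.0, 560.0],
                      [980.0, 548.0],
                      [300.0, 710.0],
                      [1105.0, 695.0]], dtype=np.float32)

# Ground-plane world coordinates (meters) of the same points, read from the
# 3D LiDAR point cloud -- hypothetical values.
world_pts = np.array([[0.0,  0.0],
                      [30.0, 0.0],
                      [0.0,  12.0],
                      [30.0, 12.0]], dtype=np.float32)

# Estimate a ground-plane homography from the matched correspondences.
H, _ = cv2.findHomography(image_pts, world_pts, cv2.RANSAC)

# A vehicle trace detected along the longitudinal scanline, expressed as
# pixel coordinates over successive frames -- hypothetical values.
trace_px = np.array([[[450.0, 600.0]],
                     [[520.0, 598.0]],
                     [[610.0, 595.0]]], dtype=np.float32)

# Project the pixel trace onto the ground plane to obtain a world-coordinate
# trajectory (one longitudinal/lateral position per frame).
trajectory = cv2.perspectiveTransform(trace_px, H).reshape(-1, 2)
print(trajectory)
```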

Original language: English (US)
Pages (from-to): 338-357
Number of pages: 20
Journal: Transportation Research Record
Volume: 2675
Issue number: 3
DOIs
State: Published - Jan 11 2020

All Science Journal Classification (ASJC) codes

  • Civil and Structural Engineering
  • Mechanical Engineering
