Title: Grid-based lane identification with roadside LiDAR data

Authors: Jianqing Wu; Chen Lv; Hongya Yue

Addresses: School of Qilu Transportation, Shandong University, East Erhuan Road No. 12550, Shizhong District, Jinan, 250061, China; Suzhou Research Institute, Shandong University, Building H of NUSP, No. 388 Ruoshui Road, SIP, Suzhou, Jiangsu, 215123, China; School of Qilu Transportation, Shandong University, East Erhuan Road No. 12550, Shizhong District, Jinan, 250061, China; Technology Research and Development Center, Shandong Provincial Communications Planning and Design Institute Co. Ltd., Jinan, China

Abstract: Lane identification is important for many applications, especially for connected-vehicle technologies. This paper presents a new method for lane identification with roadside light detection and ranging (LiDAR) serving connected vehicles. The proposed lane identification method is a revised grid-based clustering method (RGBC). The whole procedure includes background filtering, object clustering, object classification, and RGBC. A location matrix (LM) is generated to store the location of each lane. The performance of the proposed method was evaluated with data collected from the real world. The testing results showed that the RGBC can assign 96.73% of vehicles to the correct lane. The RGBC was also compared to state-of-the-art methods, and it has the lowest computational load of the compared algorithms, at the cost of slightly reduced accuracy. The time delay for real-time data processing is less than 0.1 ms, so the method can provide high-resolution micro traffic data (HRMTD) for connected vehicles.
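The abstract's core idea of a location matrix (LM) that maps grid cells to lanes can be illustrated with a minimal sketch. The cell size, grid origin, lane layout, and the helper `assign_lane` below are illustrative assumptions, not the authors' implementation; in the paper the LM would be built from the full LiDAR processing pipeline rather than hand-coded.

```python
# Minimal sketch of grid-based lane lookup with a location matrix (LM).
# All parameter values and names here are assumptions for illustration.
import numpy as np

CELL_SIZE = 0.5          # assumed grid resolution in metres
X_MIN, Y_MIN = -30.0, -30.0  # assumed origin of the gridded region

# Hypothetical LM: each grid cell stores a lane ID (0 = no lane).
# In practice this would be derived from aggregated vehicle points/trajectories.
lane_lm = np.zeros((120, 120), dtype=np.int8)
lane_lm[40:48, :] = 1    # cells covered by lane 1 (illustrative)
lane_lm[48:56, :] = 2    # cells covered by lane 2 (illustrative)

def assign_lane(centroid_xy, lm=lane_lm):
    """Map a clustered vehicle centroid (x, y) to a lane ID via the LM."""
    x, y = centroid_xy
    row = int((y - Y_MIN) / CELL_SIZE)
    col = int((x - X_MIN) / CELL_SIZE)
    if 0 <= row < lm.shape[0] and 0 <= col < lm.shape[1]:
        return int(lm[row, col])
    return 0  # centroid falls outside the gridded region

# Example: a vehicle centroid at (5.0, -8.2) m in the sensor frame -> lane 1
print(assign_lane((5.0, -8.2)))
```

Because lane assignment reduces to a single array lookup per clustered object, this kind of scheme is consistent with the low per-frame computational load reported in the abstract.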

Keywords: lane identification; roadside LiDAR; grid-based classification; connected vehicle; background filtering; object classification; micro traffic data.

DOI: 10.1504/IJSNET.2022.121158

International Journal of Sensor Networks, 2022 Vol.38 No.2, pp.85 - 96

Accepted: 06 Mar 2021
Published online: 28 Feb 2022
