
Title: Insights of computer vision-based techniques: perspective transformation and sliding window approach for lane line detection in autonomous vehicles

Authors: Madhuri Pagale; Sunanda Mulik; Richa Purohit; Anuradha Thakare

Addresses: CSE (AI & ML) Department, D.Y. Patil International University, India; Pimpri Chinchwad College of Engineering, India; School of Computer Science Engineering and Applications, D.Y. Patil International University, Pune, India; School of Computer Studies, Sri Balaji University, Pune, India; Department of Computer Engineering, Pimpri Chinchwad College of Engineering, Pune, India

Abstract: This work employs perspective transformation and the sliding window technique for lane identification. Binary thresholding is first applied, and directional gradients and gradient magnitudes are computed to enhance the information needed for lane identification. An essential early step is a perspective transformation that lets the road be examined from a different viewpoint. The sliding window method is then applied to track previously recognised lane pixels and locate the lane. The detected lane lines are superimposed onto the original image using a mask, ensuring smooth incorporation of the lane information. We also present a new lane quality metric, based on the mean and variance, that provides a quantitative measure of detection accuracy. This comprehensive strategy stands to benefit autonomous vehicles and driver-assistance systems, elevating lane identification to a new level of reliability.
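The pipeline described in the abstract can be sketched as follows. This is a minimal, illustrative reading of the steps, not the authors' implementation: all function names are my own, the `warp` routine is a pure-NumPy stand-in for what would typically be `cv2.warpPerspective`, and the mean/variance quality score is an assumed form since the paper's exact metric is not given here.

```python
import numpy as np

def gradient_binary(gray, thresh=50):
    # Directional gradients (central differences) and their magnitude;
    # keep pixels whose gradient magnitude clears the threshold.
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    return (mag >= thresh).astype(np.uint8)

def homography(src, dst):
    # Solve the 8 linear equations for the 3x3 perspective matrix
    # mapping the 4 src points onto the 4 dst points.
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp(img, M):
    # Inverse-map every output pixel (nearest neighbour) -- a minimal
    # stand-in for cv2.warpPerspective, giving the changed viewpoint.
    h, w = img.shape
    Minv = np.linalg.inv(M)
    ys, xs = np.mgrid[0:h, 0:w]
    px, py, pw = Minv @ np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    sx = np.round(px / pw).astype(int)
    sy = np.round(py / pw).astype(int)
    ok = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out = np.zeros_like(img)
    out.ravel()[ok] = img[sy[ok], sx[ok]]
    return out

def sliding_window(binary, n_windows=9, margin=50, minpix=20):
    # Start at the histogram peak of the lower half, then step a window
    # up the image, re-centring on the mean x of the pixels it captures.
    h = binary.shape[0]
    hist = binary[h // 2:, :].sum(axis=0)
    x_current = int(np.argmax(hist))
    win_h = h // n_windows
    nz_y, nz_x = binary.nonzero()
    keep = []
    for i in range(n_windows):
        y_lo, y_hi = h - (i + 1) * win_h, h - i * win_h
        good = np.where((nz_y >= y_lo) & (nz_y < y_hi) &
                        (nz_x >= x_current - margin) &
                        (nz_x < x_current + margin))[0]
        keep.append(good)
        if good.size > minpix:
            x_current = int(nz_x[good].mean())
    idx = np.concatenate(keep)
    return nz_x[idx], nz_y[idx]

def lane_quality(lane_x):
    # Illustrative mean/variance score (assumed form, not the paper's
    # exact metric): a tight, low-variance column of pixels scores well.
    return float(np.mean(lane_x)), float(np.var(lane_x))
```

On a synthetic frame with a single vertical lane marking, `gradient_binary` keeps only the marking's edges, `warp` with a road-trapezoid-to-rectangle homography would straighten the lanes, and `sliding_window` collects the edge pixels whose mean and variance feed the quality score.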

Keywords: autonomous vehicle; road lane lines; lane detection; lanes; road; safety; DAS; deep learning; perspective transformation; sliding window technique; algorithm.

DOI: 10.1504/IJVP.2025.144276

International Journal of Vehicle Performance, 2025 Vol.11 No.1, pp.26 - 52

Received: 06 Feb 2023
Accepted: 27 Apr 2024

Published online: 04 Feb 2025
