Title: Energy consumption optimisation for unmanned aerial vehicle based on reinforcement learning framework

Authors: Ziyue Wang; Yang Xing

Addresses: Department of Aerospace, Cranfield University, College Rd, Wharley End, Bedford MK43 0AL, UK (both authors)

Abstract: The average battery life of drones in use today is around 30 minutes, which poses significant limitations for long-range operations such as seamless delivery and security monitoring. Meanwhile, the transportation sector is responsible for 93% of all carbon emissions, making it crucial to control energy usage during UAV operation for future net-zero, massive-scale air traffic. In this study, a reinforcement learning (RL)-based model was implemented for the energy consumption optimisation of drones. The RL-based energy optimisation framework dynamically tunes the vehicle control system to maximise energy economy while accounting for mission objectives, environmental conditions, and system performance. RL was used to create a dynamically optimised vehicle control system that selects the most energy-efficient route. Depending on training time, a trained UAV uses between 50.1% and 91.6% less energy than an untrained UAV on the same map in this study.
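The route-selection idea in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the grid size, energy costs, and hyperparameters below are all hypothetical, chosen only to show how tabular Q-learning can learn an energy-aware path on a small map with a few high-cost cells.

```python
import random

random.seed(0)

SIZE = 4                                   # hypothetical 4x4 map
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]
ALPHA, GAMMA, EPS, EPISODES = 0.5, 0.9, 0.2, 2000
GOAL = (SIZE - 1, SIZE - 1)

# Assumed energy cost per cell entered; two cells are expensive
# (e.g. headwind zones) so the learned route should avoid them.
cost = {(r, c): 1.0 for r in range(SIZE) for c in range(SIZE)}
cost[(1, 1)] = cost[(2, 2)] = 5.0

Q = {((r, c), a): 0.0 for r in range(SIZE) for c in range(SIZE)
     for a in range(len(ACTIONS))}

def step(s, a):
    """Apply action a in state s; reward is negative energy cost."""
    dr, dc = ACTIONS[a]
    nr, nc = s[0] + dr, s[1] + dc
    if not (0 <= nr < SIZE and 0 <= nc < SIZE):
        return s, -2.0                     # bumping the boundary wastes energy
    ns = (nr, nc)
    return ns, -cost[ns] + (10.0 if ns == GOAL else 0.0)

for _ in range(EPISODES):
    s = (0, 0)
    while s != GOAL:
        # Epsilon-greedy action selection.
        a = (random.randrange(len(ACTIONS)) if random.random() < EPS
             else max(range(len(ACTIONS)), key=lambda x: Q[(s, x)]))
        ns, r = step(s, a)
        best_next = max(Q[(ns, x)] for x in range(len(ACTIONS)))
        # Standard Q-learning update rule.
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = ns

# Greedy rollout of the learned policy from the start cell.
s, path = (0, 0), [(0, 0)]
while s != GOAL and len(path) < 20:
    a = max(range(len(ACTIONS)), key=lambda x: Q[(s, x)])
    s, _ = step(s, a)
    path.append(s)
print(path)
```

After training, the greedy rollout reaches the goal while routing around the expensive cells; in the paper's framing, "energy saved" would be the difference between the untrained (e.g. random) policy's accumulated cost and the trained policy's cost on the same map.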

Keywords: power consumption; machine learning; reinforcement learning; RL; trajectory optimisation; Q-Learning; energy efficiency; path planning.

DOI: 10.1504/IJPT.2024.138001

International Journal of Powertrains, 2024 Vol.13 No.1, pp.75 - 94

Received: 03 Oct 2022
Accepted: 03 May 2023

Published online: 16 Apr 2024
