Title: Deep reinforcement learning empowered energy efficient task-offloading in cloud-radio access networks

Authors: Naveen Kumar; Anwar Ahmad

Addresses: Department of Electronics and Communication Engineering, Jamia Millia Islamia, New Delhi, India; Department of Electronics and Communication Engineering, Jamia Millia Islamia, New Delhi, India

Abstract: Mobile applications often demand computationally heavy resources to attain high quality; however, running all programs on a single mobile device consumes considerable energy and incurs significant delay. By breaking a single task into distinct components, partial computational offloading can be intelligently exploited in mobile edge computing (MEC) to minimise the energy consumption and service latency of the mobile device: some components run on the mobile device itself, while the rest are sent to a mobile edge server. Current task-offloading schemes focus primarily on average-based performance indicators and fail to satisfy deadline constraints. This paper offers a deep reinforcement learning (DRL) empowered energy-efficient task-offloading method that optimises the reward under task deadline constraints. Simulation results show that the proposed technique can efficiently transfer traffic from cloud-radio access networks to next-generation node B, while also saving energy by turning off underutilised baseband processing units.
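The offloading idea summarised in the abstract (split a task, run part locally, offload the rest, and learn a policy that maximises reward under a deadline) can be illustrated with a toy tabular Q-learning sketch. Everything below — the cost model, the action discretisation, and all constants — is an illustrative assumption, not the paper's actual DRL formulation.

```python
import random

# Toy illustration (NOT the paper's method): tabular Q-learning chooses what
# fraction of a task to offload to an edge server, trading energy against a
# hard deadline. All constants are assumed for illustration only.

ACTIONS = [0.0, 0.25, 0.5, 0.75, 1.0]  # fraction of the task offloaded
ALPHA, EPS = 0.1, 0.2                  # learning rate, exploration rate

def reward(cycles, frac, deadline=3.0):
    """Assumed cost model: the edge server is 4x faster but adds fixed
    transmission latency/energy; a missed deadline incurs a penalty."""
    local, remote = cycles * (1 - frac), cycles * frac
    latency = max(local / 1.0, remote / 4.0 + 0.5)   # parts run in parallel
    energy = local * 0.5 + (remote * 0.2 + 0.8 if frac > 0 else 0.0)
    return -energy - (10.0 if latency > deadline else 0.0)  # deadline penalty

Q = {}  # state (task size in cycles) -> list of action values

def choose(state):
    q = Q.setdefault(state, [0.0] * len(ACTIONS))
    if random.random() < EPS:
        return random.randrange(len(ACTIONS))
    return max(range(len(ACTIONS)), key=lambda a: q[a])

random.seed(0)
for _ in range(5000):
    cycles = random.choice([2.0, 4.0, 6.0])  # assumed task sizes
    a = choose(cycles)
    r = reward(cycles, ACTIONS[a])
    # tasks are one-shot, so the Q target is just the immediate reward
    Q[cycles][a] += ALPHA * (r - Q[cycles][a])

best = {s: ACTIONS[max(range(len(ACTIONS)), key=lambda a: q[a])]
        for s, q in Q.items()}
print(best)  # small tasks learn to stay local; large tasks learn to offload
```

Under these assumed costs, a small task finishes locally within the deadline at low energy, while a large task would miss the deadline unless most of it is offloaded — so the learned policy offloads more as the task grows, mirroring the partial-offloading trade-off the abstract describes.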

Keywords: C-RAN; deep reinforcement learning; DRL; mobile edge computing; Q-learning; resource allocation; task offloading.

DOI: 10.1504/IJCNDS.2023.130569

International Journal of Communication Networks and Distributed Systems, 2023 Vol.29 No.3, pp.341 - 358

Received: 11 Oct 2021
Accepted: 09 May 2022

Published online: 28 Apr 2023
