Forthcoming Articles


International Journal of Mobile Network Design and Innovation (IJMNDI)

Forthcoming articles have been peer-reviewed and accepted for publication but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

Online First articles are also listed here. Online First articles are fully citeable, complete with a DOI. They can be cited, read, and downloaded. Online First articles are published as Open Access (OA) articles to make the latest research available as early as possible.

Articles marked with the Open Access icon are Online First articles. They are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licences.

Register for our alerting service, which notifies you by email when new issues are published online.

International Journal of Mobile Network Design and Innovation (14 papers in press)

Regular Issues

  • Privacy Protection Algorithms in Mobile Network Design and Innovation: Enhancing Security in Next-Generation Wireless Communication   Order a copy of this article
    by Lili Qiu 
    Abstract: The integration of blockchain technology into e-commerce has successfully addressed several critical challenges, including the lack of transaction transparency, potential data security risks, and high payment costs. Mobile Edge Computing (MEC) enhances blockchain functionality by providing computational power while simultaneously meeting the stringent requirements for high real-time performance and low latency in e-commerce transaction systems. However, certain limitations persist within MEC-enabled e-commerce consortium blockchains, such as breaches of user privacy, vulnerabilities in consensus algorithms, and other security concerns. This study proposes a secure transaction model specifically designed for MEC-enabled e-commerce consortium blockchains. The proposed model employs a lightweight encryption algorithm to ensure the confidentiality of user information and transaction data, thereby preventing unauthorised disclosures. The simulation results indicate that the proposed solution effectively reduces classification accuracy from 91.1% to 1.42%, with only a 0.17% padding overhead when applied to a real dataset.
    Keywords: True Positive (TP); Mobile Crowd Sensing (MCS); Generative Adversarial Networks (GAN); Software Development Kit (SDK); False Negative (FN); Target Tracking Area Selection (TTAS); Anonymous Handover Pro.
    DOI: 10.1504/IJMNDI.2026.10072507
     
  • Design and Realisation of a Low-Power, High-Concurrency SDR-Based Communication System at 230 MHz   Order a copy of this article
    by Li Shang, Jiaju Zhang, Xianglong Meng, Wenjiang Pei, Junpeng Zhang 
    Abstract: Software-defined radio (SDR) devices have significantly advanced communication networks by reducing the cost and development time of radio frequency (RF) designs. Their programmability enhances system capabilities, making them valuable for both research and application-driven tasks. With the growth of the Internet of Things (IoT), the need to validate, process, and decode numerous incoming signals has increased, an area where SDRs are highly effective. This paper explores the integration of surface acoustic wave (SAW) devices and SDRs for wireless, in situ sensor response measurements. SAW devices are employed for time delay analysis, while SDRs capture signal data across the 218-230 MHz range using 1921 samples. The inverse Fourier transform is applied to convert frequency-domain data to the time domain. Signal quality is evaluated against measurements from a commercial vector network analyser (VNA). The proposed system, built around a LimeSDR Mini, achieves reliable time-delay detection with results closely matching those of the VNA.
    Keywords: Power Amplifier (PA); Local Oscillator (LO); Radio frequency (RF); High Frequency (HF); Angle of attack (AOA); Finite Impulse Response (FIR).
    DOI: 10.1504/IJMNDI.2026.10073945
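The delay-recovery step the abstract describes (inverse Fourier transform of frequency-domain data over the 218-230 MHz band) can be sketched as below. The sweep length, sample count, and delay value are illustrative assumptions, not the paper's measurement setup.

```python
import numpy as np

n_points = 1024                                  # frequency sweep points (assumed)
freqs = np.linspace(218e6, 230e6, n_points)      # 218-230 MHz band
df = freqs[1] - freqs[0]                         # frequency step
true_delay = 1.2e-6                              # synthetic 1.2 us delay to recover

# A pure time delay appears as a linear phase ramp across frequency.
response = np.exp(-2j * np.pi * freqs * true_delay)

# The IFFT turns the linear phase ramp into a peak at the delay time.
impulse = np.fft.ifft(response)
times = np.arange(n_points) / (n_points * df)    # time axis of the IFFT bins
est_delay = times[np.argmax(np.abs(impulse))]

print(f"estimated delay: {est_delay * 1e6:.2f} us")
```

The estimate is quantised to the IFFT bin spacing 1/(N df), so it lands on the bin nearest the true delay; zero-padding or peak interpolation would refine it.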
     
  • QoS Scheme for Vehicular Communication on VANETs for Detecting Message Characteristics   Order a copy of this article
    by Pramod Kumar Sagar, Suman Avdhesh Yadav, Smita Sharma, S. Vikram Singh 
    Abstract: Vehicular Ad-Hoc Networks (VANETs) are widely recognised as a promising technology for enhancing road safety and improving the efficiency of transportation systems. Delivering Quality of Service (QoS) in Vehicle-to-Everything (V2X) networks is a serious challenge due to the mobility of vehicles. This paper investigates a QoS-based framework for VANETs that uses message properties to identify messages and assign them priorities. Through the proposed method, network configurations are adjusted so that critical messages are prioritised and transmitted with guaranteed delivery. The proposed model obtained a 95.14% packet delivery ratio, 4.86% end-to-end delay, 93.62% network throughput, 89.98% transmission data rate, and 93.87% bandwidth utilisation. Simulation results prove the effectiveness of the proposed method in improving QoS parameters. With QoS support, VANETs can be used for practical applications with significantly improved reliability and efficiency.
    Keywords: Vehicular Ad-Hoc Networks; Quality of Services; Infrastructure and Delay; Network Throughput; Transmission Data Rate; Network Configurations; Reliability and Efficiency.
    DOI: 10.1504/IJMNDI.2026.10074896
     
  • Mobile Network-Enabled Visualisation Framework for Real-Time Disaster Science Communication and Response Coordination   Order a copy of this article
    by Ting Gao 
    Abstract: This study proposes a comprehensive framework that integrates mobile network coverage prediction, artificial intelligence (AI), the Internet of Things (IoT), augmented reality (AR), and social media analytics to enhance real-time disaster communication and emergency response. The system utilises signal strength data and deep learning models to predict mobile network coverage and enable accurate remote environment mapping. A standard operating procedure (SOP)-based platform automates disaster response for fires, floods, and earthquakes through monitoring, control, and support modules. Augmented reality and gesture recognition facilitate real-time visualisation and interactive control via mobile interfaces. Social media and IoT data are processed using machine learning algorithms to detect emergencies and support informed decision-making. Simulated scenarios, including missile strikes and earthquakes, demonstrate the system's adaptability in managing urban resilience. Performance evaluations under varying traffic loads confirm the system's reliability and low latency, offering a scalable and efficient solution for modern disaster management.
    Keywords: Real-Time Visualisation; Disaster Management; Mobile Network Communication; Emergency Response Coordination; Situational Awareness; GIS-Based Mapping.
    DOI: 10.1504/IJMNDI.2026.10075086
     
  • Dynamic Bandwidth Allocation and Task Scheduling in Large-Scale IoT Networks   Order a copy of this article
    by Xuefei Xu 
    Abstract: The rapid advancement of mobile Internet and the proliferation of Internet of Things (IoT) applications have substantially increased demands for data transmission and processing. However, IoT models often operate in resource-constrained environments where real-time processing is critical, limiting their performance. To address these challenges, we propose Fishnet-6G, a novel framework that employs a mesh-based packet ordering and resource allocation strategy to optimise 6G networks. The network architecture is based on the Sierpinski Triangle, and Quantised Density Peak Clustering (QDPC) is used for efficient device connectivity. Cluster Heads (CHs) and Substitute CHs are dynamically selected using real-time data. Traffic prediction is enhanced through fair queue state assessment and an adaptive-rate Improved Deep Deterministic Policy Gradient (IMPDDPG) algorithm. Scheduling is managed using a Bayesian Game-Theoretic Approach (BGTA). Simulated using Network Simulator-3.26, Fishnet-6G demonstrates superior performance in throughput, latency, energy efficiency, and packet loss, offering a robust solution for 6G-IoT networks.
    Keywords: Bandwidth Allocation; IoT Networks; Fishnet-6G; Edge server; BGTA algorithm; IMPDDPG method.
    DOI: 10.1504/IJMNDI.2026.10075385
     
  • Spectrum Allocation and Optimisation of Wireless Communication Networks   Order a copy of this article
    by Palanikumar S, Ramamoorthy S, N. Ashok Kumar, Sheshang Degadwala 
    Abstract: The wireless communication network spectrum is a limited resource. With the rapid growth of mobile communication services in recent years, traditional spectrum allocation methods, typically based on fixed allocation strategies, often lead to inefficient and uneven resource distribution. Consequently, there is an urgent need to address the issues of spectrum allocation and optimisation. The integration of semantic mobile computing within the Internet of Things (IoT), along with advancements in emerging bionic models, offers novel approaches to this challenge. Analytical results indicate that, in terms of performance, the success rate of the Swarm Fission Optimisation Algorithm (SFOA) surpasses that of the particle swarm optimisation (PSO) algorithm, with both achieving a 100% success rate in function evaluations. Furthermore, the SFOA algorithm demonstrates superior stability and accuracy compared to the other two algorithms. Its advantages become particularly pronounced under high signal-to-noise ratio (SNR) conditions.
    Keywords: Mobile Computer (MC); LTE (Long-Term Evolution); H-IoT; GAA radio; Spectrum Allocation; Wireless Communication Networks; Graph theory-based models.
    DOI: 10.1504/IJMNDI.2026.10075429
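The PSO baseline the abstract benchmarks SFOA against can be sketched as a minimal particle swarm minimiser. The hyperparameters and the toy objective below are common illustrative defaults, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso(objective, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimisation (assumed textbook form)."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # particle positions
    v = np.zeros_like(x)                               # particle velocities
    pbest = x.copy()                                   # per-particle bests
    pbest_val = np.apply_along_axis(objective, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()             # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # inertia + cognitive + social terms
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(objective, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, float(pbest_val.min())

# Toy objective standing in for a spectrum-utility function.
sphere = lambda z: float(np.sum(z ** 2))
best, best_val = pso(sphere, dim=5)
print(best_val)   # near zero after convergence
```

The "success rate in function evaluations" the abstract reports would then count how often such runs reach a target objective value within a fixed evaluation budget.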
     
  • Big Data-Powered Artificial Intelligence Approaches for Security and Anomaly Detection in Mobile Network Infrastructures   Order a copy of this article
    by Shifu Zhang, Yawei Zhang, Boqun Cheng 
    Abstract: This project investigates anomaly detection and mobile network cybersecurity through the integration of AI and deep learning. The proposed method enables scalable deployment within network infrastructures and enhances detection accuracy by leveraging large-scale data collection, pre-processing, feature extraction, and hybrid model training. With the rapid expansion of 5G and the Internet of Things (IoT), ensuring security and anomaly detection presents critical challenges. AI-powered approaches offer adaptive solutions to address evolving threats effectively. While prior research has primarily focused on autoencoders, LSTM networks, and CNNs for IoT intrusion detection, limited attention has been given to anomaly detection and predictive modelling in cellular networks. The proposed framework incorporates automated encoding, CNNs, LSTMs, and federated learning to detect vulnerabilities in real time. Experimental results demonstrate superior performance, achieving 97.5% accuracy, 96.2% recall, and 96.8% F1-score, thereby validating the framework's effectiveness in advancing anomaly detection in mobile networks.
    Keywords: Big Data Analytics; Artificial Intelligence (AI); Machine Learning; Anomaly Detection; Mobile Network Security; Intrusion Detection; Cybersecurity; Data-Driven Intelligence.
    DOI: 10.1504/IJMNDI.2026.10075471
     
  • Intelligent Signal Processing and Spectrum Management: A Hybrid Framework Combining Optimised Stockwell Transform and Deep Learning for Wide Area Network Planning and Radar Wave Classification   Order a copy of this article
    by Bedilu Ababu Teka, Demissie Jobir, R.A.M. Sewak Singh, Vivek Singh Bhadouria, Bijaya Paikray 
    Abstract: Existing methods, such as spectrograms, Wigner-Ville distributions, and wavelet transforms, are highly sensitive to noise and interference and may produce distorted time-frequency representations (TFRs), reducing radar waveform classification accuracy. To address these problems, this study proposes a hybrid model integrating convolutional neural networks (CNNs), machine learning, and an optimised Stockwell transform (OpST). The optimisation of the Stockwell transform leverages particle swarm optimisation to maximise energy concentration. The proposed method captures high-resolution features when processed through CNN transfer-learning models such as MobileNetV2 (MNv2). The simulation results show that the MNv2 model with a support vector machine (SVM) achieved a precision of 100%, a recall of 99%, and an F1-score of 99% for radar wave classification, with an average accuracy of 99.61%, the best among all evaluated CNN models. This combination provides comprehensive noise and interference mitigation, marking a substantial advancement in the fields of signal processing and spectrum management.
    Keywords: Transfer Learning; Time-frequency; convolutional neural network; K-Nearest Neighbors; MobileNetV2; Support Vector Machine; Radar waveforms; Wide Area Network.
    DOI: 10.1504/IJMNDI.2026.10075472
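The energy-concentration objective that PSO maximises when tuning the Stockwell-transform window can be sketched with a standard concentration measure. This exact form is an assumption for illustration, not necessarily the measure used in the paper.

```python
import numpy as np

def energy_concentration(tfr):
    """Inverse L1 norm of the energy-normalised TFR magnitude:
    larger values mean energy packed into fewer time-frequency cells."""
    a = np.abs(tfr)
    a = a / np.sqrt(np.sum(a ** 2))   # unit-energy normalisation
    return 1.0 / np.sum(a)

# Two extreme cases: all energy in one cell vs. smeared everywhere.
sharp = np.zeros((64, 64)); sharp[32, 32] = 1.0   # fully concentrated
blurred = np.ones((64, 64))                        # fully smeared

print(energy_concentration(sharp) > energy_concentration(blurred))
```

A PSO run would evaluate this measure on the TFR produced by each candidate window parameter and keep the parameter with the highest score.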
     
  • Enhancing Network Security in Wireless Communication Systems Using Deep Learning   Order a copy of this article
    by Pratik Patel, N. Ashok Kumar, Vidhya S, Sheshang Degadwala 
    Abstract: With the increasing reliance on digital connectivity, ensuring security in wireless communication systems has become more critical than ever. Traditional Network Intrusion Detection Systems (NIDS) primarily rely on pattern recognition techniques. However, they often struggle to detect novel threats effectively, leading to limitations in their performance. This study addresses these challenges by leveraging advancements in computing, networking, and deep learning methodologies. The proposed approach involves pre-processing and transforming network data to enhance its utility for threat detection. A deep learning-based NIDS was developed utilising a Convolutional Neural Network (CNN) architecture. Key features were systematically selected, and dimensionality reduction techniques were applied to optimize the model's performance. The resulting system achieved a detection accuracy of 98.5%, demonstrating significant potential to outperform traditional methods. This approach represents a more robust and adaptive solution for addressing contemporary challenges in network security.
    Keywords: Quality of Experience (QoE); Wireless sensor networks (WSNs); Channel state information (CSI); Intrusion detection system (IDS); Local area network (LAN); Recursive feature elimination (RFE).
    DOI: 10.1504/IJMNDI.2026.10075473
     
  • Measurement and analysis of RSS in IoT using Bluetooth mesh networks   Order a copy of this article
    by Xiaobin Wu, Dan Luo 
    Abstract: Industry 4.0 environments require the rapid indoor localisation and tracking of assets. By placing components and materials correctly, each step an item passes through during manufacturing can be tracked, giving visibility into the production process. Because GPS is unavailable indoors, an alternative approach is needed. This study proposes an indoor localisation scheme based on beacons and the received signal strength (RSS) indicator (RSSI) of a Bluetooth mesh network. Several machine learning approaches, including convolutional neural networks (CNNs), recurrent neural networks (RNNs), and regressors, are trained and compared on real-world testbeds to determine which methods are the most accurate, practical, and effective for indoor localisation. A live data feed is used as a final test of real-time performance. The results show that the K-Nearest Neighbours method performs best, providing visibility into asset locations and the associated operational benefits.
    Keywords: Bluetooth; localisation improvement; optimal K for KNN and WKNN; high-level architecture; Bluetooth mesh profile specification.
    DOI: 10.1504/IJMNDI.2025.10072671
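The K-Nearest Neighbours localisation the abstract favours can be sketched as RSSI fingerprinting: match a live RSSI reading against surveyed fingerprints and average the positions of the closest matches. The fingerprints, beacon count, and coordinates below are made up for illustration.

```python
import math

# (RSSI from beacon A, RSSI from beacon B) -> known (x, y) position in metres
fingerprints = [
    ((-40, -70), (0.0, 0.0)),
    ((-55, -55), (2.0, 2.0)),
    ((-70, -40), (4.0, 4.0)),
    ((-50, -60), (1.0, 1.5)),
]

def knn_locate(rssi, k=2):
    """Average the positions of the k fingerprints closest in RSSI space."""
    ranked = sorted(fingerprints, key=lambda fp: math.dist(fp[0], rssi))
    nearest = [pos for _, pos in ranked[:k]]
    x = sum(p[0] for p in nearest) / k
    y = sum(p[1] for p in nearest) / k
    return (x, y)

print(knn_locate((-52, -58)))  # -> (1.5, 1.75), between the two closest fingerprints
```

The WKNN variant mentioned in the keywords would weight each neighbour's position by the inverse of its RSSI distance instead of averaging uniformly.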
     
  • Machine learning-based prediction of allocative localisation error in wireless sensor networks   Order a copy of this article
    by Guo Li, Hongyu Sheng 
    Abstract: Allocative localisation errors (ALEs) in wireless sensor networks (WSNs) can be predicted and characterised more effectively with machine learning. Three important schemes are adaptive neuro-fuzzy inference systems (ANFIS), stacking regression, and AdaBoost regressors. The stacking regression model combines multiple regression methods to predict error characteristics across several variables. As part of the study, the Mayfly optimisation algorithm (MOA) was used to improve the accuracy of several well-known prediction methods. A new ensemble model, STASA + MFO, was developed for this work: it augments the stacking regression model with AdaBoost and MOA. With an R2 value of 0.9913, this method outperformed the other approaches tested. The ADA + ANFIS model achieved an R2 value of 0.9810, and the ANMF model (AdaBoost with MOA) an R2 value of 0.9824.
    Keywords: WSNs; wireless sensor networks; ALEs; allocative localisation errors; machine learning algorithm; metaheuristic algorithms.
    DOI: 10.1504/IJMNDI.2025.10071735
     
  • Nature-inspired based ensemble feature selection and stacked ensemble classifier fusion for Android malware detection   Order a copy of this article
    by Anuja A. Rajan, R. Durga 
    Abstract: Android is widely used on tablets and smartphones, and Android app malware has consequently grown quickly in recent years. Machine learning algorithms are effective for detecting malware in these apps, but a reliable and efficient detection approach remains difficult because of the vast number of features and high-dimensional datasets, which yield low accuracy despite research and industry efforts. Feature selection involves finding and removing features from a dataset while retaining class-label variance. Using dynamic analysis of Android malware samples, this study introduces the nature-inspired ensemble feature selection (NIEFS) and stacked ensemble classifier fusion (SECF) multi-classification models. The NIEFS model uses evolutionary computation (EC) methods, including the Fuzzy Membership Grasshopper Optimisation Algorithm (FMGOA), Lévy flight pigeon-inspired optimisation (LEFPIO), and the Cauchy Operator Squirrel Search Algorithm (COSSA), to remove redundant or irrelevant features and select relevant ones, improving detection accuracy. The multilayer SECF integrates the outputs of the EC-selected models using a Mutual Information (MI)-based ensemble approach with good detection accuracy, combining machine learning methods such as J48, the mean weight deep belief network (MWDBN), REPTree, and Voted Perceptron. Finally, classifier performance was evaluated in MATLAB R2020a using Precision (Pre), Recall (Rec), F-measure (FM), and Weighted F-measure.
    Keywords: AMD; Android malware detection; NIEFS; nature-inspired based ensemble feature selection; MWDBN; mean weight deep belief network; ensemble technique; LEFPIO; Lévy flight pigeon-inspired optimisation; COSSA; Cauchy operator squirrel search algorithm.
    DOI: 10.1504/IJMNDI.2025.10071599
     
  • Analysis of spectrum sensing using convolutional neural network   Order a copy of this article
    by Sankarsan Panda, D. Chitra Devi, M. Madhini, K. PeriyarSelvam 
    Abstract: This study treats spectrum sensing as a classification problem, using a convolutional neural network-long short-term memory (CM-LSTM) network and a signal correlation matrix. Three signals received by the space antenna array are cross-correlated with one another, and a single signal is selected for timing analysis. The LSTM (long short-term memory) network is well suited to extracting time-related features. The received array signals and the correlations between them are then fed to the LSTM classification model, which determines how the signals fit into the timeline and location and substantially improves band detection. Model-based experiments show that the CM-LSTM spectrum-sensing algorithm outperforms the SVM-, GBM-, RF-, and ED-based spectrum-sensing algorithms.
    Keywords: CRN; cognitive radio network; CR; cognitive radio; SNR; signal-to-noise ratio; MFD; matched filter detection; ED; energy detection; CNN-LSTM; convolutional neural network-long short-term memory; ARQ; automatic repeat request.
    DOI: 10.1504/IJMNDI.2025.10072133
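The signal correlation matrix the CM-LSTM classifier is fed can be sketched from the cross-correlations of an antenna array's received signals. The array size, snapshot count, and noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_antennas, n_snapshots = 3, 1000
t = np.arange(n_snapshots)

# Case 1: a common primary-user tone plus noise at each antenna (band occupied).
signal = np.exp(1j * 2 * np.pi * 0.05 * t)
occupied = signal + 0.5 * (rng.standard_normal((n_antennas, n_snapshots))
                           + 1j * rng.standard_normal((n_antennas, n_snapshots)))

# Case 2: independent noise only (band idle).
idle = (rng.standard_normal((n_antennas, n_snapshots))
        + 1j * rng.standard_normal((n_antennas, n_snapshots)))

def corr_matrix(x):
    """Sample correlation matrix R = X X^H / N."""
    return x @ x.conj().T / x.shape[1]

def off_diag_energy(r):
    """Total magnitude of the cross-correlation (off-diagonal) entries."""
    return float(np.abs(r - np.diag(np.diag(r))).sum())

# A shared signal makes the off-diagonal entries large; independent noise
# leaves them near zero - that contrast is what the classifier learns from.
print(off_diag_energy(corr_matrix(occupied)) > off_diag_energy(corr_matrix(idle)))
```

In the paper's pipeline, sequences of such matrices would then be passed to the LSTM stage to exploit their evolution over time.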
     
  • Deep learning-aided wireless channel estimation in 6G   Order a copy of this article
    by M. Sivanathan, A. Rajasekar, T. Veena, S. Aswini 
    Abstract: Various processes can facilitate the exchange of information, allowing multiple inputs and outputs. A base station (BS) is essential for a user terminal (UT) to obtain accurate channel state information (CSI). However, pilot signals can only be transmitted a limited number of times and for a specific duration. Due to the dynamic nature of the environment, the position of the UT is constantly changing. Consequently, since the number of pilot signals remains constant, pilot signals must be transmitted repeatedly in neighbouring cells. This repetition results in pilot contamination, which complicates the establishment of connections between adjacent cells. The minimum mean square error (MMSE) method can still yield reasonable results, even in the presence of pilot contamination. While the BS has knowledge of the channels linked to each UT, this information is not available to the MMSE method. In this study, two channel estimation approaches are proposed.
    Keywords: deep learning; CSI; channel state information; LS and MMSE; LoS; line-of-sight; IoT; Internet of Things; SNR; signal-to-noise ratio; MMPGA-LS; ResNet.
    DOI: 10.1504/IJMNDI.2025.10072670
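The two classical pilot-based estimators the abstract builds on, least squares (LS) and linear MMSE, can be sketched for a flat-fading channel y = h*p + n. The SNR, trial count, and unit-variance channel prior are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials, snr_db = 2000, 5
snr = 10 ** (snr_db / 10)
pilot = 1.0 + 0j                       # known pilot symbol, unit power

# Rayleigh channel with E|h|^2 = 1, plus complex Gaussian noise.
h = (rng.standard_normal(n_trials) + 1j * rng.standard_normal(n_trials)) / np.sqrt(2)
noise_var = 1.0 / snr
n = np.sqrt(noise_var / 2) * (rng.standard_normal(n_trials)
                              + 1j * rng.standard_normal(n_trials))
y = h * pilot + n

h_ls = y / pilot                       # LS: simply invert the known pilot
# Linear MMSE shrinks the LS estimate toward the prior mean using the
# channel variance (here E|h|^2 = 1) and the noise variance.
h_mmse = (1.0 / (1.0 + noise_var)) * h_ls

mse_ls = float(np.mean(np.abs(h_ls - h) ** 2))
mse_mmse = float(np.mean(np.abs(h_mmse - h) ** 2))
print(mse_ls, mse_mmse)   # MMSE error is lower at this SNR
```

The gap between the two estimators widens at low SNR, which is where the abstract's point about MMSE remaining reasonable under pilot contamination matters most.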