Forthcoming Articles

International Journal of Grid and Utility Computing (IJGUC)

Forthcoming articles have been peer-reviewed and accepted for publication but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

Online First articles are also listed here. Online First articles are fully citeable, complete with a DOI. They can be cited, read, and downloaded. Online First articles are published as Open Access (OA) articles to make the latest research available as early as possible.

Articles marked with this Open Access icon are Online First articles. They are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licences.

Register for our alerting service, which notifies you by email when new issues are published online.

International Journal of Grid and Utility Computing (27 papers in press)

Regular Issues

  • Recommendation system based on space-time user similarity
    by Wei Luo, Zhihao Peng, Ansheng Deng 
    Abstract: With the advent of 5G, the ways people obtain and transmit information have become increasingly important. As the main platform for information transmission, social media not only brings convenience to people's lives, but also generates huge amounts of redundant information owing to the speed at which information is updated. Recommendation systems emerged to meet users' personalised needs and help them find interesting information in large volumes of data. As an important tool for filtering internet information, recommendation systems play an extremely important role in both academia and industry. Traditional recommendation systems assume that all users are independent. In this paper, to improve prediction accuracy, a recommendation system based on space-time user similarity is proposed. Experimental results on a Sina Weibo dataset show that, compared with a traditional collaborative filtering recommendation system based on user similarity, the proposed method performs better in precision, recall and F-measure.
    Keywords: time-based user similarity; space-based user similarity; recommendation system; user preference; collaborative filtering.
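The paper's space-time similarity measure is not public, so the sketch below uses plain cosine user similarity as a stand-in, together with the precision, recall and F-measure evaluation the abstract mentions. The toy ratings are hypothetical.

```python
import math

# Hypothetical toy ratings: user -> {item: rating}.
ratings = {
    "u1": {"a": 5, "b": 3, "c": 4},
    "u2": {"a": 4, "b": 2, "c": 5},
    "u3": {"a": 1, "c": 2, "d": 5},
}

def cosine(u, v):
    # Cosine similarity over the items both users rated.
    common = set(u) & set(v)
    num = sum(u[i] * v[i] for i in common)
    den = math.sqrt(sum(x * x for x in u.values())) * \
          math.sqrt(sum(x * x for x in v.values()))
    return num / den if den else 0.0

def prf(recommended, relevant):
    # Precision, recall and F-measure of a recommendation list.
    tp = len(set(recommended) & set(relevant))
    precision = tp / len(recommended) if recommended else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    f = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f

sim = cosine(ratings["u1"], ratings["u2"])
p, r, f = prf(recommended=["a", "d"], relevant=["a", "b"])
```

A space-time variant would weight these similarities by when and where users interacted, but that weighting scheme is the paper's contribution and is not reproduced here.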

  • Joint end-to-end recognition deep network and data augmentation for industrial mould number recognition   Order a copy of this article
    by RuiMing Li, ChaoJun Dong, JiaCong Chen, YiKui Zhai 
    Abstract: With the booming manufacturing industry, the significance of mould management is increasing. At present, manual management is gradually being phased out owing to the large amount of labour it requires, while radiofrequency identification (RFID) systems perform poorly because of metal-related problems such as rust and erosion. Fortunately, the rise of convolutional neural networks (CNNs) offers an image-based solution to mould management: identifying the digit number on the mould. Yet there is no public database for mould recognition, and no specialised recognition method exists in this field. To address this problem, this paper first presents a novel dataset to support CNN training. The images in the database were collected in real scenes and carefully labelled by hand, so they can train an effective recognition model that generalises to the actual scenario. In addition, we combined a mainstream text spotter with data augmentation designed specifically for the real world, and found that this has a considerable effect on mould recognition.
    Keywords: mould recognition database; text spotter; mould recognition; data augmentation.

  • University ranking approach with bibliometrics and augmented social perception data   Order a copy of this article
    by Kittayaporn Chantaranimi, Rattasit Sukhahuta, Juggapong Natwichai 
    Abstract: Typically, universities aim for a high position in ranking systems to build their reputation. However, self-evaluating rankings can be costly because the indicators come not only from bibliometrics but also from the results of over a thousand surveys. In this paper, we propose a novel approach to estimating university rankings based on traditional data, i.e., bibliometrics, and non-traditional data, i.e., Altmetric Attention Scores and Sustainable Development Goals indicators. Our approach estimates subject-area rankings in Arts & Humanities, Engineering & Technology, Life Sciences & Medicine, Natural Sciences, and Social Sciences & Management. The results are then evaluated against the QS subject ranking using Spearman rank-order correlation and overlapping rate. Our approach, particularly for the top-10 ranking, estimated positions effectively and could therefore assist stakeholders in estimating a university's position when survey data are not available.
    Keywords: university ranking; rank similarity; bibliometrics; augmented social perception data; sustainable development goals; Altmetrics.
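The Spearman rank-order correlation used in the evaluation above can be computed directly from rank differences. A minimal sketch (no tie handling, which suffices for distinct scores; the score values are hypothetical):

```python
def spearman(x, y):
    # Rank-transform each list, then apply
    # rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)).
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical scores: estimated ranking vs. QS subject ranking.
est = [92.0, 85.5, 78.3, 66.0, 59.1]
qs = [88.0, 90.0, 75.0, 68.0, 55.0]
rho = spearman(est, qs)
```

A rho close to 1 indicates the estimated ranking orders universities almost identically to the QS ranking; `scipy.stats.spearmanr` gives the same statistic with tie handling.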

  • Assessment of a cuckoo search-based intelligent system for mesh routers placement optimisation in WMNs considering various distributions of mesh clients   Order a copy of this article
    by Shinji Sakamoto 
    Abstract: Wireless Mesh Networks (WMNs) have many advantages, but they also have several issues related to wireless communication. An effective approach to these problems is optimising the placement of mesh routers in WMNs, which is an NP-hard problem; thus, heuristic and intelligent algorithms are needed. In previous work, we developed an intelligent simulation system (WMN-CS) based on Cuckoo Search (CS), a meta-heuristic algorithm. In this work, we evaluate WMN-CS performance for various distributions of mesh clients: Uniform, Normal, Exponential, Weibull and Chi-square. The simulation results show that for the Normal distribution, WMN-CS can find suitable mesh router locations in about 30 phases. The Uniform distribution has the lowest performance of the distributions considered. Also, the Exponential and Weibull distributions converge more slowly than the Chi-square distribution, while the Exponential distribution converges faster than the Weibull distribution.
    Keywords: wireless mesh networks; node placement problem; cuckoo search; client distributions.
    DOI: 10.1504/IJGUC.2024.10068275
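The CS loop underlying a system like WMN-CS can be sketched in a few lines. Everything here is a toy stand-in: the cost function (distance of two routers from two client clusters), the Gaussian step in place of a true Levy flight, and all parameter values are assumptions for illustration, not the paper's simulator.

```python
import math
import random

def cost(pos):
    # Toy objective: total distance of two routers from two client clusters.
    # A real WMN-CS fitness would combine client coverage and connectivity.
    clusters = [(10.0, 10.0), (30.0, 25.0)]
    return sum(math.dist(p, c) for p, c in zip(pos, clusters))

def cuckoo_search(n_nests=15, phases=30, pa=0.25, alpha=1.0):
    random.seed(42)  # deterministic for the example
    nests = [[(random.uniform(0, 40), random.uniform(0, 40)) for _ in range(2)]
             for _ in range(n_nests)]
    best = min(nests, key=cost)
    for _ in range(phases):
        for i in range(n_nests):
            # Random-walk step (Gaussian stand-in for a Levy flight).
            new = [(x + alpha * random.gauss(0, 1), y + alpha * random.gauss(0, 1))
                   for x, y in nests[i]]
            if cost(new) < cost(nests[i]):
                nests[i] = new
        # Abandon the worst fraction pa of nests and rebuild them randomly.
        nests.sort(key=cost)
        for i in range(int(pa * n_nests)):
            nests[-(i + 1)] = [(random.uniform(0, 40), random.uniform(0, 40))
                               for _ in range(2)]
        best = min(nests + [best], key=cost)
    return best, cost(best)

best_placement, best_cost = cuckoo_search()
```

The `phases=30` setting mirrors the abstract's observation that around 30 phases suffice under a Normal client distribution; changing the client-cluster layout models the other distributions.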
     
  • Analysis of cybersecurity attacks and solution approaches   Order a copy of this article
    by Ali Yılmaz, Resul Das 
    Abstract: Numerous cybersecurity vulnerabilities have emerged as a result of the quickly changing digital environment, posing serious risks to people, corporations and governments alike. This article delves into the subtleties of this contemporary conflict zone and offers a thorough analysis of the many attack vectors, techniques, and their far-reaching effects. It examines the full spectrum of online hazards, from well-known adversaries such as malware, phishing and denial-of-service attacks to more sophisticated and stealthy threats such as advanced persistent threats (APTs). It explores the methods and strategies employed by cybercriminals, shedding light on their ever-changing approaches. The article also covers preventive measures and problem-solving techniques created to combat these risks, and provides an overview of a wide range of cybersecurity techniques and tools, including artificial intelligence, intrusion detection systems, encryption and multifactor authentication. Individuals and organisations can protect their digital domains from the onslaught of cyberattacks by being aware of the threats and having the necessary tools and knowledge. With a clearer grasp of the cyberthreat landscape and the various solutions that reduce these risks, readers should be better equipped to use the internet safely.
    Keywords: information security; cybersecurity; malware; network security; attacks; cyberthreats.
    DOI: 10.1504/IJGUC.2024.10068492
     
  • A two-stage intrusion detection framework in IoT using random forest for binary and multi-class classification   Order a copy of this article
    by Arash Salehpour, Pejman Hosseinioun, MohammadAli Balafar 
    Abstract: The proliferation of IoT devices, together with increasingly sophisticated cyber threats, requires a strong security framework. This paper proposes a new framework for an IoT Intrusion Detection System (IDS) that uses a Random Forest classifier first for binary attack classification and then to prepare a new dataset that enables multi-class classification. It achieved an overall accuracy of 0.98 on the comprehensive UNSW-NB15 dataset, with very good performance in detecting 'Generic' attacks, including almost perfect precision, recall and F1-score. It also identified the 'Analysis' and 'Backdoor' attack types as cases where further improvement is needed. Three models, Random Forest, XGBoost and MLP, were analysed to find their pros and cons in IoT settings. Further studies could examine combining models with improved features, real-time intrusion detection, and stronger AI techniques. This paper also addresses challenges with imbalanced classes and scalability concerns, using data-privacy preservation methods to improve IDS performance.
    Keywords: IDS; intrusion detection system; IoT; internet of things; ensemble learning; UNSW-NB15; cybersecurity; hybrid models.
    DOI: 10.1504/IJGUC.2024.10071203
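The two-stage structure described above (binary attack/normal detection, then multi-class labelling of flagged flows) can be sketched as a pipeline. The threshold rules below are hypothetical stand-ins for the paper's trained Random Forest models, and the feature names are invented for illustration.

```python
# Stage 1: binary attack/normal decision (stand-in for the RF binary model).
def stage1_binary(flow):
    return flow["pkts_per_s"] > 1000 or flow["dst_port"] not in (80, 443, 53)

# Stage 2: multi-class labelling, applied only to flows flagged as attacks
# (stand-in for the RF multi-class model on the stage-2 dataset).
def stage2_multiclass(flow):
    if flow["pkts_per_s"] > 5000:
        return "Generic"
    if flow["dst_port"] == 6667:
        return "Backdoor"
    return "Analysis"

def classify(flow):
    # Only attack-flagged flows reach the multi-class stage.
    return stage2_multiclass(flow) if stage1_binary(flow) else "Normal"

flows = [
    {"pkts_per_s": 12, "dst_port": 443},     # benign-looking flow
    {"pkts_per_s": 9000, "dst_port": 80},    # volumetric flow
    {"pkts_per_s": 1500, "dst_port": 6667},  # odd-port flow
]
labels = [classify(f) for f in flows]
```

In the actual framework both stages would be `RandomForestClassifier` instances trained on UNSW-NB15 features; the gain of the split is that the multi-class model only has to separate attack categories, not attacks from benign traffic.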
     
  • Fractional red panda optimisation-based cluster head selection and routing in IoT   Order a copy of this article
    by Nandkumar Prabhakar Kulkarni, Meena Chavan, Amar Rajendra Mudiraj 
    Abstract: In the present world, the IoT has developed into a widespread network of smart devices across numerous applications, and is considered a significant technology for meeting the requirements of a variety of applications. However, load balancing, inadequate battery energy and security may affect IoT performance. Therefore, Fractional Red Panda Optimisation (FrRPO)-based Cluster Head (CH) selection and routing is proposed in this paper. Simulating the IoT network is the primary process. A Deep Q Network (DQN) is used to predict energy. FrRPO with fitness factors such as predicted energy, delay and distance is used to select the CH. Moreover, FrRPO with fitness factors such as throughput, energy, distance and reliability is used for routing. Metrics such as energy consumption, delay and throughput are used to validate the model, which attains optimal results of 0.555 J, 0.666 s and 89.02 Mbps.
    Keywords: cluster head; red panda optimisation; internet of things; routing; fractional calculus.
    DOI: 10.1504/IJGUC.2025.10072128
     
  • Comparable IoT and DL methods of drinking water usage   Order a copy of this article
    by Arber Musliu, Naim Baftiu 
    Abstract: IoT applications actively employ advanced technology, using neural networks to understand and connect with their surroundings. Amazon Echo, for instance, bridges the physical and human realms with the digital domain, employing deep learning for voice-command comprehension; similarly, Microsoft's Windows facial recognition security system integrates DL to unlock devices upon recognising a face. This research explores the integration of IoT with DL techniques to enhance the monitoring and analysis of drinking water quality, evaluating various methods for determining drinkability. Several machine learning algorithms, including Random Forest, LightGBM and Bagging Classifier, are employed to predict water quality from parameters such as pH, conductivity and turbidity. The study uses a comprehensive dataset of 3277 samples covering nine water quality indicators. Comparative analysis revealed similar outcomes: Random Forest demonstrated the highest accuracy, achieving a predictive accuracy of 0.824695, followed by LightGBM and Bagging Classifier. This research contributes to ongoing efforts to employ advanced computational techniques in environmental monitoring, providing a reliable methodological framework for future studies to enhance water quality assessment.
    Keywords: drinking water usage; IoT; DL methods; water monitoring; smart water systems.
    DOI: 10.1504/IJGUC.2024.10072181
     
  • Dynamic application placement and resource optimisation technique for heterogeneous fog computing environments   Order a copy of this article
    by S. Sheela, S. M. Dilip Kumar 
    Abstract: The growing demand for reducing latency in service requests of Internet of Things (IoT) applications has led researchers to shift from Cloud computing to Fog computing paradigms. Fog computing brings computation and data storage closer to devices and sensors, reducing latency and improving response time and reliability. However, implementing fog computing successfully requires fog nodes and applications to be deployed effectively to provide high-performance services. In addition, fog computing faces challenges in efficient resource scheduling owing to the scarce capacity of fog nodes and the dynamic, heterogeneous nature of devices, which complicates workload allocation and optimal resource utilisation. This work presents a simplified model for dynamically placing application modules in a heterogeneous fog computing environment. A new learning framework is implemented using a Deep Deterministic Policy Gradient (DDPG)-based reinforcement learning technique to predict operations and determine cumulative rewards. A test environment demonstrates that the proposed framework has lower mobility dependency, higher reward and reduced variance compared with existing schemes.
    Keywords: application placement; cloud computing; bandwidth; deep deterministic policy gradient; dynamic; fog computing; heterogeneous; Markov Model; optimal node placement; resource management.
    DOI: 10.1504/IJGUC.2024.10072572
     
  • Enhancing fruit farming efficiency through IoT-driven soil moisture analysis and classifier ensemble   Order a copy of this article
    by Chinmayee Senapati, Swagatika Senapati, Satyaprakash Swain 
    Abstract: Effective soil moisture management is key to optimising fruit farming productivity. This research integrates IoT devices, clustering techniques, PCA and ensemble learning to enhance soil moisture classification for crops such as pomegranate, mulberry, mango, grapes, ragi and potato. Among the classifiers tested, the Random Forest model demonstrated superior accuracy, and a stacked Random Forest-SVM model further improved accuracy to 94.65%. This research underscores the importance of IoT-driven data and machine learning in precision agriculture, demonstrating how advanced techniques can refine soil moisture management. By optimising soil moisture, we directly enhance nutrient availability, root development and water uptake, leading to better crop yield, quality and sustainability. This approach highlights the synergy between technology and machine learning, advancing sustainable and efficient fruit cultivation.
    Keywords: IoT; PCA; NBM; RBF; K-star; RF; SVM; KNN; GNB; DT; clustering.
    DOI: 10.1504/IJGUC.2024.10066245
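The stacked-classifier idea above can be sketched structurally: base learners emit scores, and a meta-level combiner decides the final class. The base "models" and weights below are hypothetical rule-based stand-ins, not the paper's trained RF-SVM stack, and the feature names are invented.

```python
def base_rf(sample):
    # Stand-in for the Random Forest's probability of the "wet" class.
    return 0.9 if sample["moisture"] > 30 else 0.2

def base_svm(sample):
    # Stand-in for the SVM decision value mapped into [0, 1].
    return 0.8 if sample["moisture"] + sample["humidity"] > 80 else 0.3

def meta(sample, w_rf=0.6, w_svm=0.4, threshold=0.5):
    # Meta-level combiner over base-learner outputs. In true stacking this
    # would itself be a model trained on held-out base predictions.
    score = w_rf * base_rf(sample) + w_svm * base_svm(sample)
    return "wet" if score >= threshold else "dry"

samples = [{"moisture": 45, "humidity": 50},
           {"moisture": 10, "humidity": 40}]
preds = [meta(s) for s in samples]
```

With scikit-learn, the same structure is `StackingClassifier(estimators=[("rf", rf), ("svm", svm)], final_estimator=...)`, which learns the combiner instead of fixing the weights.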
     
  • Effort forecasting approach for new generation agile projects   Order a copy of this article
    by Anupama Kaushik, Prabhjot Kaur, Tirimula Rao Benala, Muhammad Altaf Ahmed 
    Abstract: Nowadays many software companies undertake agile software development alongside traditional development based on the waterfall model. Effort estimation is an important constituent of any software development firm, as it can measure the success rate of a software project. In this paper, an analogy-based effort estimation model is proposed for agile projects. The model is evaluated on three agile datasets using Mean Magnitude of Relative Error (MMRE), Median Magnitude of Relative Error (MdMRE) and prediction (PRED) as evaluation criteria. The results demonstrate that analogy-based effort estimation, which has been used extensively for traditional software development, also works well for projects developed using agile methodologies. Two statistical tests, Friedman's test and the Holm post-hoc test, are performed on the MMRE values; their results confirm that analogy-based effort estimation is suitable for agile projects as well.
    Keywords: software development effort; analogy estimation; agile projects; agile methodologies; agile effort; similarity measures; Zia Dataset; CD-1 Dataset; CD-2 Dataset.
    DOI: 10.1504/IJGUC.2024.10069592
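The three evaluation criteria named above are all derived from the magnitude of relative error (MRE) of each estimate. A minimal sketch, using hypothetical effort values for illustration:

```python
import statistics

def mre(actual, predicted):
    # Magnitude of Relative Error for one project.
    return abs(actual - predicted) / actual

def evaluate(actuals, predictions, pred_level=0.25):
    mres = [mre(a, p) for a, p in zip(actuals, predictions)]
    mmre = sum(mres) / len(mres)                            # Mean MRE
    mdmre = statistics.median(mres)                         # Median MRE
    pred = sum(m <= pred_level for m in mres) / len(mres)   # PRED(25)
    return mmre, mdmre, pred

# Hypothetical actual vs. estimated effort values (e.g., person-hours).
actual = [100.0, 80.0, 60.0, 40.0]
estimated = [110.0, 60.0, 63.0, 42.0]
mmre, mdmre, pred25 = evaluate(actual, estimated)
```

Lower MMRE/MdMRE and higher PRED(25) (the fraction of projects estimated within 25% of their actual effort) indicate a better estimation model.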
     
  • Optimising DNNs for load forecasting: the power of hyperparameter tuning   Order a copy of this article
    by Faisal Mehmood Butt, Seong-O Shim, Safa Habibullah, Abdulwahab Ali Almazroi, Lal Hussain, Umair Ahmad Salaria 
    Abstract: This study investigated the effectiveness of deep learning for electricity demand forecasting across different timescales. For one-day forecasts, a double hidden layer network with Rectified Linear Unit (ReLU) and sigmoid activation functions achieved the lowest Mean Absolute Percentage Error (MAPE) of 4.23%, requiring only four neurons in the hidden layers. Longer timescales necessitated more complex architectures. The one-month forecast achieved a MAPE of 2.78% with a double ReLU-sigmoid network and 12 neurons in the hidden layers. Even challenging three-month forecasts were tackled effectively by a double ReLU network with ten neurons, resulting in a MAPE of 2.75%. These findings highlight a crucial point: capturing the non-linearity and dynamic nature of long-term forecasts requires more intricate network designs, utilising a strategic selection of activation functions and enough neurons. By optimising network architecture, we can ensure the electricity grid adapts to meet demand fluctuations, optimise resource allocation, and even facilitate future planning.
    Keywords: optimisation; ReLU; rectified linear unit; convolutional neural networks; deep neural networks; sigmoid function; neurons; layers.
    DOI: 10.1504/IJGUC.2024.10067738
     
  • Leveraging a hybrid whale-grey wolf optimisation algorithm to enhance fifth generation deployment efficiency in multi-access edge computing   Order a copy of this article
    by B. Senthilkumar, U. Barakkath Nisha, V. Kalpana, Priscilla Joy, S. Gokila 
    Abstract: In contemporary advanced manufacturing environments, deploying numerous intelligent mobile devices is crucial to meet the rising need for adaptable production capabilities. This article addresses critical challenges in modern connected environments where such devices, equipped with a variety of sensors, constantly synchronise massive data sets. With the growing importance of energy efficiency, current studies highlight energy use as a substantial expense. Nevertheless, previous research has mostly analysed energy consumption in cloud servers, neglecting the energy usage associated with edge computing and underestimating the influence of diverse mobile devices; this becomes even more important as interconnected settings grow. The article presents an integer programming model for efficiently deploying MEC servers and fifth-generation (5G) small cells. Given the NP-hard nature of edge server deployment, it proposes a novel Hybrid Whale-Grey Wolf Optimisation (HWGWO) algorithm, a metaheuristic that combines the global exploration capabilities of the Whale Optimisation Algorithm (WOA) with the local search efficiency of Grey Wolf Optimisation (GWO), achieving a balanced and efficient search process.
    Keywords: MEC; multi-access edge computing; HWGWO; hybrid whale-grey wolf optimisation.
    DOI: 10.1504/IJGUC.2024.10069979
     
  • Combining classifier ensembles for efficient big data analytics in edge-cloud environments   Order a copy of this article
    by Sirisha Potluri, Khasim Syed, Santi Swarup Basa, J. Kavitha Reddy, P. Pavan Kumar 
    Abstract: Big Data analytics is essential for modern organisations because of the exponential expansion of data from sources such as social media, sensors, mobile devices and the Internet of Things. By examining this data, organisations can learn a great deal about consumer behaviour, industry trends, operational inefficiencies and creative potential. In this work, we propose an efficient ensemble classification model using metaheuristic optimisation: Chaotic Pigeon-Inspired Optimisation (CPIO), Random Forest (RF) and Support Vector Machine (SVM). The proposed model performs better feature selection and assigns appropriate class labels to the given cloud service data. This ensemble supports the classification process and boosts the model's performance. We performed a series of simulations to observe the outcomes under several dimensions and parameters; the results emphasise the efficiency of the proposed model over recent practices in edge-cloud computing platforms.
    Keywords: machine learning; artificial intelligence; cloud computing; edge computing; ensemble classification; big data analytics.
    DOI: 10.1504/IJGUC.2024.10070219
     
  • Secure and privacy-preserving medical image retrieval model in cloud-edge computing   Order a copy of this article
    by Sirisha Potluri, R. Balamurali, B. Ravikrishna 
    Abstract: Content-based image retrieval is one of the most significant image retrieval techniques, but it is challenging in the cloud owing to privacy concerns, the risk of leaking sensitive data, and the need to encrypt images before storing them. In this work, we propose a secure and privacy-preserving medical image retrieval model that supports multiple image sources or owners and their sharing concerns. We securely encrypt the image features using a multi-party computation and sharing model with the owners' own keys. This allows efficient image retrieval over images collected from multiple sources with guaranteed image privacy. The technique also computes image similarity measurements without keeping image similarity information in the cloud. Results demonstrate that the proposed model achieves improved image retrieval accuracy while preserving the security and privacy of the images.
    Keywords: content-based image retrieval; healthcare; computer vision; cloud computing; edge computing; encryption.
    DOI: 10.1504/IJGUC.2024.10070218
     
  • An architectural framework for interoperable security anomaly detection in edge computing and SDN using gradient-boosted trees   Order a copy of this article
    by R. Gowrishankar, V.J. Arulkarthick, A.P. Janani, K. Sundaresan, U. Barakkath Nisha, R. Yasir Abdullah, Thrivikram Bathini 
    Abstract: In the ever-changing environment of edge computing and Software-Defined Networking (SDN), it is crucial to have strong security measures in place to protect against new threats and ensure compatibility between different systems. Conventional anomaly detection algorithms frequently have difficulties in dealing with the intricacy and dynamic characteristics of such situations. This paper presents a specialised architectural framework designed for edge computing and SDN, utilising the sophisticated capabilities of Gradient-Boosted Trees (GBT) to achieve efficient anomaly detection. Our framework aims to improve security in edge computing and SDN infrastructures by utilising the predictive capabilities and adaptability of GBT. GBT offers enhanced precision in identifying security anomalies when compared to conventional approaches by examining extensive and diverse data streams prevalent in such contexts. The effectiveness of the suggested architecture in simulated edge computing and SDN systems is proven through empirical evaluations. The results demonstrate substantial enhancements in the detection of anomalies and a decrease in false positive occurrences, underscoring the capability of GBT in strengthening security within intricate and ever-changing ecosystems. This research enhances security solutions specifically designed for edge computing and SDN environments. The suggested architecture establishes the foundation for future research and development endeavours focused on tackling the changing security challenges in edge computing and SDN domains.
    Keywords: edge computing; SDN; software-defined networking; security anomaly detection; GBT; gradient-boosted trees; interoperability; framework architecture.
    DOI: 10.1504/IJGUC.2024.10069455
     
  • Low-powered IoT device for monitoring dementia patients using A9G module and ESP32 C3 MCU in cloud environment   Order a copy of this article
    by Kumar Saurabh, Manish Madhava Tripathi, Satyasundara Mahapatra 
    Abstract: As the world's population ages, more and more people are affected by dementia, a disease that causes gradual mental decline. This challenges not only the patient but also the caregiver's personal life and the health system. Keeping the system smart requires an intelligent monitoring system. The basic objective is to develop an optimised IoT device with the latest sensors and micro-controllers that stores data systematically in a cloud environment. This research presents a methodology that upgrades a normal monitoring device into an advanced IoT system using the A9G module and ESP32 C3 MCU. After optimisation, the device is evaluated on various qualitative parameters and its overall effectiveness in real-world applications. The findings suggest that while simple IoT devices offer only indoor monitoring, the proposed device provides comprehensive monitoring that enhances the patient's safety and the caregiver's peace of mind.
    Keywords: dementia; efficient utilisation; IoT-based monitoring; patient care; data analysis; privacy; wearable devices.
    DOI: 10.1504/IJGUC.2024.10067985
     
  • An exploration of edge-based energy harvesting and routing strategies to enhance communication efficiency in IoT networks   Order a copy of this article
    by B. Senthilkumar, Prashant Bachanna, S. Chandragandhi, U. Barakkath Nisha, M. Sravani, S. Swathi, P. Penchala Prasanth, R. Yasir Abdullah 
    Abstract: This article explores the domain of Internet of Things (IoT) sensor networks, with a specific emphasis on improving sustainability and communication efficiency. Given the widespread use of interconnected devices, it is crucial to prioritise sustainable functioning and effective communication. Energy harvesting is becoming a promising approach for overcoming power limitations, allowing sensors to gather energy from their environment. Furthermore, routing algorithms have a crucial function in optimising communication efficiency inside these networks. The aim of this paper is to investigate various methods of harnessing energy and routing algorithms to discover novel approaches that promote sustainability and improve communication performance in IoT sensor networks. This study aims to offer significant insights and recommendations for creating resilient and environmentally friendly IoT solutions through a thorough analysis. The study assesses a range of energy acquisition techniques, encompassing solar, kinetic, thermal and radio frequency acquisition. Every solution has distinct benefits and difficulties, requiring a detailed evaluation of their appropriateness for implementing IoT sensors in different situations. In addition, the examination includes the study of routing algorithms, which are crucial for ensuring efficient data transfer and network stability. The goal is to determine the most effective methods for improving communication performance while minimising energy use. This study aims to enhance the development of sustainable and resilient IoT sensor networks by combining knowledge from the areas of energy harvesting and routing.
    Keywords: IoT sensor networks; energy harvesting; routing algorithms; sustainability; communication efficiency; renewable energy.
    DOI: 10.1504/IJGUC.2024.10069951
     
  • Reinforcing blockchain security with advanced anomaly detection in edge computing using EDGESENTRY   Order a copy of this article
    by B. Senthilkumar, Prashant Bachanna, U. Barakkath Nisha, M. Sravani, S. Swathi, K. Srihari, Thrivikram Bathini, R. Yasir Abdullah 
    Abstract: Within the realm of distributed systems, the fusion of blockchain technology and edge computing has generated considerable attention, offering the potential for improved security and efficiency in decentralised applications. This article presents EDGESENTRY ('Enhanced Detection for Genuine Edge Security'), a novel anomaly detection technique designed exclusively for blockchain-integrated edge computing settings. A comparison between EDGESENTRY and existing anomaly detection algorithms developed for edge computing was conducted, highlighting EDGESENTRY's distinctive characteristics and performance benefits. Through thorough experimentation and assessment, including comparison against state-of-the-art methodologies, the efficiency of EDGESENTRY in identifying and mitigating irregularities in blockchain transactions at the network edge was demonstrated. The article emphasises the importance of EDGESENTRY in strengthening blockchain security in edge computing environments, and examines the implications for future research and development, including possible improvements to the algorithm and new uses in emerging fields. EDGESENTRY tackles the difficulty of identifying abnormal behaviour in decentralised systems, enabling more secure and robust edge computing infrastructures as blockchain technology advances.
    Keywords: anomaly detection; blockchain security; edge computing; EDGESENTRY; reliability.
    DOI: 10.1504/IJGUC.2024.10069049
     
  • LIDS security shield: lightweight intrusion detection system (IDS) for detecting DDoS attacks on IoT environment   Order a copy of this article
    by Sai Suchith Kurra, Likhitha Rayana, Sai Anurag Ayyalasomayajula, Mohamed Sirajudeen Yoosuf 
    Abstract: The proliferation of edge computing and the Internet of Things (IoT) has raised serious concerns about linked devices' susceptibility to Distributed Denial of Service (DDoS) attacks. The new lightweight Intrusion Detection System (IDS) model proposed in this study is intended only for the detection of DDoS assaults against edge and Internet of Things devices. Because they frequently use a lot of resources, traditional IDS solutions are inappropriate for resource-constrained IoT and edge computing settings. With minimal computing overhead, the suggested methodology uses anomaly detection and machine learning techniques to find patterns linked to DDoS assaults. In addition, the model considers the particularities of IoT and edge devices, considering things like constrained computing power, energy limitations and communication protocols. The IDS model's lightweight design ensures its applicability to resource-constrained devices, making it an invaluable tool for bolstering security in the rapidly expanding IoT and edge computing ecosystems.
    Keywords: IoT; DDoS attack; edge device; cyber-security; light-weight IDS system.
    DOI: 10.1504/IJGUC.2024.10069499
     
  • Innovative edge computing-driven recognition of cardiac monitoring application   Order a copy of this article
    by R. Kishore Kanna, Rajendar Sandiri, Biswajit Brahma, Bhawani Sankar Panigrahi, Susanta Kumar Sahoo, Rajitha Ala 
    Abstract: Cardiac disease has emerged as a significant global health concern, and the mortality rate for abrupt cardiac arrest is high. Continuous monitoring of a subject's physical condition has the potential to save up to 60% of human lives. The IoT performs well in the remote monitoring of a patient's status, and the widespread utilisation of wearable devices will further enhance the impact of the Internet of Things (IoT). This study evaluates five prediction models on the collected data and selects the random forest algorithm as the best method for edge computing applications. The MAX30102 sensor is suggested for acquiring the patient's vitals and parameters used in the assessment. User authentication data are stored in the database. The selected predictive modelling approach analyses critical information for anomalies, and alert messages triggered by detected heartbeat abnormalities are transmitted via the ThingSpeak cloud to the nearest healthcare facility.
    Keywords: computing; edge-computing; cardiac; healthcare.
    DOI: 10.1504/IJGUC.2024.10068165
     
  • SHO: an efficient multi-objective task scheduling in hybrid cloud-fog computing environment   Order a copy of this article
    by Santhosh Kumar Medishetti, Ganesh Reddy Karri 
    Abstract: Cloud-Fog Computing (CFC) has emerged as a solution to meet the growing demand for efficient processing in diverse applications. Task Scheduling (TS) within CFC is crucial for resource optimisation and meeting user needs. This paper presents a novel approach to TS using the Spotted Hyena Optimisation (SHO) algorithm, drawing inspiration from hyena pack dynamics. SHO aims to minimise makespan, energy consumption and execution time while enhancing throughput. The study extensively evaluates SHO's adaptability to dynamic workloads and varying user demands through simulations. Results demonstrate SHO's superior performance over existing algorithms, highlighting its potential to enhance CFC system efficiency and reliability. The proposed algorithm reduces makespan by 13.69%, execution time by 26.06% and energy consumption by 30.99%, while increasing throughput by 14.59%, compared with three existing advanced scheduling algorithms. This research contributes to the ongoing optimisation of TS in CFC, offering valuable insights into nature-inspired algorithms' effectiveness in addressing complex computational challenges.
    Keywords: task scheduling; cloud-fog computing; spotted hyena optimisation; makespan; execution time; energy consumption.
    DOI: 10.1504/IJGUC.2024.10070065
     
  • Generative AI in IoT: transforming cloud services with intelligent automation   Order a copy of this article
    by Ashwin Raiyani, Sheetal Pandya, Kaushal Jani, Zalak Vyas 
    Abstract: This paper explores how Generative AI can revolutionise cloud services, enabling smart automation over the Internet of Things (IoT). It outlines the potential use of Generative AI in smart homes, industrial automation, healthcare and transportation, proposing an architecture that integrates cloud computing, edge computing and IoT computing for Generative AI. The integration allows IoT devices to assess data, make independent decisions and manage themselves based on customised user experiences and functionality. Case studies and experimental evaluations demonstrate significant productivity, efficiency and user satisfaction improvements. However, challenges such as data heterogeneity, security issues and ethical considerations must be addressed for reliable AI-enabled IoT applications. Collaboration with academicians, business experts and governmental representatives is suggested to build trustworthy spaces for integrating AI capabilities into IoT cloud services, leveraging the joint effort between Gen AI and IoT for innovation, productivity and competitiveness.
    Keywords: generative AI; internet of things; cloud services; intelligent automation; AI-Driven IoT.
    DOI: 10.1504/IJGUC.2024.10069892
     
  • Multi-objective enhanced task scheduling algorithm for SLA violation mitigation in cloud computing using deep reinforcement learning   Order a copy of this article
    by Mallu Shiva Rama Krishna, Khasim Vali Dudekula 
    Abstract: Task scheduling in distributed heterogeneous cloud environments presents challenges due to NP-hard complexity, non-linear dynamics, and multi-objective optimisation requirements. To address these challenges, we propose an intelligent scheduling framework that combines real-time workload monitoring with predictive task analysis using Deep Q-Network (DQN) reinforcement learning. Our adaptive solution continuously optimises resource allocation through experiential learning, improving system performance while reducing energy consumption and minimising SLA violations in dynamic cloud environments. We evaluate the Multi-Objective Enhanced Task Scheduling Algorithm for SLA Violation Mitigation in Cloud Computing using Deep Reinforcement Learning (METCD) against baseline algorithms, including WOA, CA, GWO, ALO, and GA. The results demonstrate that METCD outperforms existing approaches, reducing makespan by 28.76% to 40.03%, SLA violations by 29.59% to 47.12%, and energy consumption by 32.82% to 42.2%. These improvements emphasise the efficiency of METCD in optimising task scheduling within cloud computing environments.
    Keywords: SLA violation; cloud computing; task scheduling; energy consumption; machine learning; makespan; deep Q-learning.
    DOI: 10.1504/IJGUC.2025.10071740
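    As a toy illustration of the reward-driven scheduling idea behind approaches such as METCD (not the authors' algorithm itself), the sketch below uses tabular Q-learning — a DQN replaces the table with a neural network — to learn makespan-aware placement of tasks on two hypothetical VMs. All names, parameters and the reward shape are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    N_VMS, LEVELS = 2, 10
    Q = np.zeros((LEVELS, LEVELS, N_VMS))     # state: discretised per-VM loads

    def bucket(load, cap):
        """Map a VM load in [0, cap] to one of LEVELS discrete buckets."""
        return min(int(load / cap * LEVELS), LEVELS - 1)

    alpha, gamma, eps = 0.1, 0.9, 0.1
    for _ in range(2000):                     # episodes: schedule a batch of tasks
        tasks = rng.uniform(1.0, 5.0, size=8)
        cap = tasks.sum()                     # worst case: all tasks on one VM
        loads = np.zeros(N_VMS)
        for t in tasks:
            s = tuple(bucket(l, cap) for l in loads)
            a = rng.integers(N_VMS) if rng.random() < eps else int(np.argmax(Q[s]))
            old_makespan = loads.max()
            loads[a] += t
            r = -(loads.max() - old_makespan)  # penalise any makespan increase
            s2 = tuple(bucket(l, cap) for l in loads)
            Q[s][a] += alpha * (r + gamma * Q[s2].max() - Q[s][a])

    # Greedy policy on a fresh batch: learned makespan vs. the single-VM worst case.
    tasks = rng.uniform(1.0, 5.0, size=8)
    cap, loads = tasks.sum(), np.zeros(N_VMS)
    for t in tasks:
        s = tuple(bucket(l, cap) for l in loads)
        loads[int(np.argmax(Q[s]))] += t
    learned_makespan = float(loads.max())
    print(f"learned makespan {learned_makespan:.2f} vs worst case {tasks.sum():.2f}")
    ```

    The negative-makespan-increase reward steers the greedy policy towards the lighter VM, approximating a least-loaded heuristic; a DQN generalises the same update rule to continuous, high-dimensional states.
    
    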
     

Special Issue on: AMLDA 2022 Applied Machine Learning and Data Analytics Applications, Challenges, and Future Directions

  • Fuzzy forests for feature selection in high-dimensional survey data: an application to the 2020 US Presidential Election   Order a copy of this article
    by Sreemanti Dey, R. Michael Alvarez 
    Abstract: An increasingly common methodological issue in the field of social science is high-dimensional and highly correlated datasets that are unamenable to the traditional deductive framework of study. Analysis of candidate choice in the 2020 Presidential Election is one area in which this issue presents itself: in order to test the many theories explaining the outcome of the election, it is necessary to use data such as the 2020 Cooperative Election Study Common Content, with hundreds of highly correlated features. We present the fuzzy forests algorithm, a variant of the popular random forests ensemble method, as an efficient way to reduce the feature space in such cases with minimal bias, while also maintaining predictive performance on par with common algorithms such as random forests and logit. Using fuzzy forests, we isolate the top correlates of candidate choice and find that partisan polarisation was the strongest factor driving the 2020 Presidential Election.
    Keywords: fuzzy forests; machine learning; ensemble methods; dimensionality reduction; American elections; candidate choice; correlation; partisanship; issue voting; Trump; Biden.

  • An efficient intrusion detection system using unsupervised learning AutoEncoder   Order a copy of this article
    by N.D. Patel, B.M. Mehtre, Rajeev Wankar 
    Abstract: As attacks on network environments have become more sophisticated and intelligent in recent years, the limitations of existing signature-based intrusion detection systems have become more evident. For new attacks such as Advanced Persistent Threats (APTs), signature patterns suffer from poor generalisation performance. Research on intrusion detection systems based on machine learning is being actively conducted to solve this problem. However, far fewer attack samples than normal samples are collected in real network environments, leading to a class imbalance problem. When a supervised learning-based anomaly detection model is trained on such data, the results are biased towards normal samples. In this paper, an AutoEncoder (AE) is used to perform single-class anomaly detection to solve this imbalance problem. The experimental evaluation was conducted using the CIC-IDS2017 dataset, and the performance of the proposed method was compared with supervised models.
    Keywords: intrusion detection system; advanced persistent threat; CICIDS2017; AutoEncoder; machine learning; data analytics.
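    The single-class principle described in this abstract — train an autoencoder on normal traffic only and flag inputs with high reconstruction error — can be illustrated with a minimal sketch. The synthetic data, the tiny linear autoencoder and the 99th-percentile threshold below are illustrative assumptions, not the authors' CIC-IDS2017 pipeline:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative stand-in for "normal" traffic: 4 features lying on a 2D
    # subspace, so a 2-unit bottleneck can reconstruct normal samples well.
    s, t = rng.normal(size=(2, 500))
    normal = np.stack([s, 0.5 * s, t, -t], axis=1)
    # Anomalies lie off that subspace (shifted generic Gaussian points).
    anomalies = rng.normal(3.0, 1.0, size=(20, 4))

    # Tiny linear autoencoder (4 -> 2 -> 4) trained by gradient descent
    # on normal samples only.
    W1 = rng.normal(0, 0.5, (4, 2))
    W2 = rng.normal(0, 0.5, (2, 4))
    lr = 0.05
    for _ in range(3000):
        H = normal @ W1                      # encode
        E = H @ W2 - normal                  # reconstruction residual
        gW2 = H.T @ E / len(normal)
        gW1 = normal.T @ (E @ W2.T) / len(normal)
        W1 -= lr * gW1
        W2 -= lr * gW2

    def recon_error(X):
        """Per-sample mean squared reconstruction error."""
        return ((X @ W1 @ W2 - X) ** 2).mean(axis=1)

    # Flag anything whose error exceeds the 99th percentile of normal errors.
    thr = np.percentile(recon_error(normal), 99)
    flag_rate = float((recon_error(anomalies) > thr).mean())
    print(f"anomaly detection rate: {flag_rate:.2f}")
    ```

    Because only normal samples are used for training, the class imbalance the paper highlights never enters the fit; attack samples are needed solely for evaluation.
    
    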

Special Issue on: Cloud and Fog Computing for Corporate Entrepreneurship in the Digital Era

  • Study on the economic consequences of enterprise financial sharing model   Order a copy of this article
    by Yu Yang, Zecheng Yin 
    Abstract: Using enterprise system ideas to examine the business process requirements of firms, the Financial Enterprise Model (FEM) is a demanding programme that integrates finance, accounting and other critical business processes. Conventional financial models face difficulties owing to low economic inclusion, restricted access to capital, lack of data, poor R&D expenditure, underdeveloped distribution channels, and so on. This paper examines the making, consuming and redistributing of goods through collaborative platform networks. These three instances highlight how ICTs (Information and Communication Technologies) can be exploited as a new source of company innovation. The sharing economy model can help social companies solve their market problems, since social value can be embedded into their sharing economy cycles. As part of the ICT-based sharing economy, new business models for social entrepreneurship can be developed by employing creative and proactive platforms. Unlike most public organisations, double-bottom-line organisations can create both social and economic advantages. There are implications for developing and propagating societal values based on these findings.
    Keywords: finance; economy; enterprise; ICT; social advantage.