Forthcoming and Online First Articles

International Journal of Grid and Utility Computing

International Journal of Grid and Utility Computing (IJGUC)

Forthcoming articles have been peer-reviewed and accepted for publication but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

Online First articles are published online here, before they appear in a journal issue. Online First articles are fully citeable, complete with a DOI. They can be cited, read, and downloaded. Online First articles are published as Open Access (OA) articles to make the latest research available as early as possible.

Articles marked with this Open Access icon are Online First articles. They are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licenses.

Register for our alerting service, which notifies you by email when new issues are published online.

We also offer RSS feeds, which provide timely updates of tables of contents, newly published articles and calls for papers.

International Journal of Grid and Utility Computing (44 papers in press)

Regular Issues

  • Research on modelling analysis and maximum power point tracking strategies for distributed photovoltaic power generation systems based on adaptive control technology   Order a copy of this article
    by Yan Geng, Jianwei Ji, Bo Hu, Yingjun Ju 
    Abstract: Distributed photovoltaic power generation technology has developed rapidly in recent years, but its cost remains much higher than that of traditional power generation modes. Therefore, improving the effective use of photovoltaic cells has become a popular research direction. Based on an analysis of the characteristics of photovoltaic cells, this paper presents a mathematical model of photovoltaic cells and a maximum power point tracking algorithm that combines hysteresis control with an adaptive, variable-step perturbation-and-observation method. This algorithm balances the control precision and control speed of the perturbation-and-observation method and improves the tracking results significantly. Finally, the feasibility of the algorithm and its tracking performance are verified by simulation in Matlab/Simulink.
    Keywords: distributed photovoltaic; adaptive control technology; maximum power point tracking strategies.
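
The variable-step perturb-and-observe idea in the abstract above can be sketched in a few lines: the perturbation step is scaled by the locally observed power slope, so it is large far from the maximum power point and small near it, trading speed for precision automatically. The function names, the toy PV curve and all constants below are illustrative assumptions, not the paper's model.

```python
def mppt_variable_step(pv_power, v0=30.0, step0=0.5, k=0.05, step_min=0.01, steps=80):
    """Variable-step perturb-and-observe MPPT sketch.

    pv_power: callable V -> P for the (simulated) PV curve.
    The next step is proportional to the observed |dP/dV|, with a lower
    bound, and reverses direction whenever the last perturbation
    decreased the output power (classic P&O rule).
    """
    v = v0
    p = pv_power(v)
    step = step0
    for _ in range(steps):
        v_new = v + step
        p_new = pv_power(v_new)
        dp = p_new - p
        sign = 1.0 if step > 0 else -1.0
        if dp < 0:              # power fell: reverse perturbation direction
            sign = -sign
        # adaptive magnitude: proportional to the local power slope
        step = sign * max(step_min, k * abs(dp / step))
        v, p = v_new, p_new
    return v, p

# toy PV curve with its maximum power (200 W) at V = 35
pv = lambda V: max(0.0, 200.0 - (V - 35.0) ** 2)
v_mpp, p_mpp = mppt_variable_step(pv)
```

On this toy curve the tracker settles near the 35 V peak; the paper's hysteresis band, which further suppresses oscillation around the peak, is omitted here.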

  • SDSAM: a service-oriented approach for descriptive statistical analysis of multidimensional spatio-temporal big data   Order a copy of this article
    by Weilong Ding, Zhuofeng Zhao, Jie Zhou, Han Li 
    Abstract: With the expansion of the Internet of Things, spatio-temporal data has been widely generated and used. The rise of big data in space and time has led to a flood of new applications with statistical analysis characteristics. Applications based on statistical analysis of these data must deal with the large volume, diversity and frequent changes of the data, as well as its query, integration and visualisation. Developing such applications is essentially a challenging and time-consuming task. In order to simplify the statistical analysis of spatio-temporal data, a service-oriented method is proposed in this paper. The method defines models of spatio-temporal data services and functional services, defines a process-based spatio-temporal big data statistical application that invokes these basic data services and functional services, and proposes an implementation of both kinds of service on the Hadoop environment. The validity and applicability of the method are verified by a case study of highway big data analysis.
    Keywords: spatio-temporal data; RESTful; web service.

  • Research on integrated energy system planning method considering wind power uncertainty   Order a copy of this article
    by Yong Wang, Yongqiang Mu, Jingbo Liu, Yongji Tong, Hongbo Zhu, Mingfeng Chen, Peng Liang 
    Abstract: With the development of energy technology, the planning and operation of integrated energy systems coupling electricity, gas and heat has become an important research topic in the future energy field. In order to address the influence of wind power uncertainty on the unified planning of integrated energy systems, this paper constructs a wind energy uncertainty quantification model based on intuitionistic fuzzy sets. Based on this, an integrated energy system planning model with optimal economic cost and environmental cost is established and solved by the harmony search algorithm. Finally, the proposed method is validated by simulation examples. The proposed planning method improves the grid's capacity to accommodate wind power and reduces the system's CO2 emissions, and it offers guidance for the long-term planning of integrated energy systems.
    Keywords: wind power uncertainty; planning method; electricity-gas-heat energy.

  • A privacy-aware and fair self-exchanging self-trading scheme for IoT data based on smart contract   Order a copy of this article
    by Yuling Chen, Hongyan Yin, Yaocheng Zhang, Wei Ren, Yi Ren 
    Abstract: With the development of the era of big data, the demand for data sharing and usage is increasing, especially in the era of the Internet of Things, putting forward a keen demand for data exchanging and data trading. However, existing data exchanging and trading platforms are usually centralised, and users have to trust the platforms. This paper proposes a secure and fair exchanging and trading protocol based on blockchain and smart contract that, notably, achieves self-governance without relying on centralised trust. The protocol guarantees fairness to defend against trade cheating and security for data confidentiality. It also guarantees efficiency by transferring data links instead of data between data owners and data buyers. The extensive analysis justifies that the proposed scheme can facilitate self-exchanging and self-trading of big data in a secure, fair and efficient manner.
    Keywords: big data; IoT; fair exchanging; blockchain; smart contract; oblivious protocol; fair trading.

  • Micro-PaaS fog: container based orchestration for IoT applications using SBC   Order a copy of this article
    by Walter D.O. Santo, Rubens De Souza Matos Júnior, Admilson De Ribamar Lima Ribeiro, Danilo Souza Silva, Reneilson Yves Carvalho Santos 
    Abstract: The Internet of Things (IoT) is an emerging technology paradigm in which ubiquitous sensors monitor physical infrastructures, environments and people in real time to help decision making and improve the efficiency and reliability of systems, adding comfort and quality of life to society. In this sense, there are questions concerning the limitation of computational resources, high latency and different QoS requirements related to IoT that move cloud technologies towards fog computing and the adoption of lightweight virtualised solutions, such as container-based technologies, to attend to the needs of different domains. This work therefore proposes and implements a micro-PaaS architecture for fog computing, in a cluster of single-board computers (SBCs), for orchestration of applications using containers, applied to IoT and attending to QoS criteria, e.g. high availability, scalability, load balancing and latency. From this proposed model, the micro-PaaS fog was implemented with container virtualisation using orchestration services in a cluster built with Raspberry Pi to monitor water and energy consumption, at a total cost of ownership equivalent to 23% of a public platform as a service (PaaS).
    Keywords: fog computing; cluster; orchestration; containers; single board computing.

  • Anomaly detection against mimicry attacks based on time decay modelling   Order a copy of this article
    by Akinori Muramatsu, Masayoshi Aritsugi 
    Abstract: Because cyberattackers attempt to cheat anomaly detection systems, it is required to make an anomaly detection system robust against such attempts. We focus on mimicry attacks and propose a system to detect such attacks in this paper. Mimicry attacks make use of ordinary operations in order not to be detected. We take account of time decay in modelling operations to give lower priorities to preceding operations, thereby enabling us to detect mimicry attacks. We empirically evaluate our proposal with varying time decay rates to demonstrate that our proposal can detect mimicry attacks that could not be detected by a state-of-the-art anomaly detection approach.
    Keywords: anomaly detection; mimicry attacks; time decay modelling; stream processing.
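
The time-decay modelling described above can be illustrated with a small sketch: each operation's frequency weight decays exponentially with age, so ordinary operations that a mimicry attack pads into its trace lose influence over time and the profile is dominated by recent behaviour. The class, its half-life parameter and the scoring rule are assumptions for illustration, not the paper's exact model.

```python
import math

class DecayProfile:
    """Time-decayed operation-frequency profile (sketch).

    Each observed operation contributes weight 1.0, and all weights
    decay exponentially as time advances, so preceding operations get
    lower priority than recent ones.
    """
    def __init__(self, half_life=10.0):
        self.rate = math.log(2) / half_life
        self.weights = {}          # operation -> decayed weight
        self.last_t = 0.0

    def _decay(self, t):
        f = math.exp(-self.rate * (t - self.last_t))
        for op in self.weights:
            self.weights[op] *= f
        self.last_t = t

    def observe(self, op, t):
        self._decay(t)
        self.weights[op] = self.weights.get(op, 0.0) + 1.0

    def anomaly_score(self, op, t):
        """1.0 = never seen; near 0.0 = dominant recent operation."""
        self._decay(t)
        total = sum(self.weights.values()) or 1.0
        return 1.0 - self.weights.get(op, 0.0) / total
```

A frequently observed recent operation scores low, an operation seen once and long ago scores high, and an unseen operation scores 1.0.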

  • A cloud-based spatiotemporal data warehouse approach   Order a copy of this article
    by Georgia Garani, Nunziato Cassavia, Ilias Savvas 
    Abstract: The arrival of the big data era introduces new necessities for accommodating data access and analysis by organisations. The evolution of data is three-fold: increases in volume, variety and complexity. The majority of data nowadays is generated in the cloud, and cloud data warehouses enhance the benefits of the cloud by facilitating the integration of cloud data. A data warehouse is developed in this paper which supports both spatial and temporal dimensions. The research focuses on proposing a general design for spatiobitemporal objects implemented by nested dimension tables using the starnest schema approach. Experimental results reflect that the parallel processing of such data in the cloud can answer OLAP queries efficiently, and that increasing the number of computational nodes significantly reduces query execution time. The feasibility, scalability and utility of the proposed technique for querying spatiotemporal data are demonstrated.
    Keywords: cloud computing; big data; hive; business intelligence; data warehouses; cloud based data warehouses; spatiotemporal data; spatiotemporal objects; starnest schema; OLAP; online analytical processing.

  • Recommendation system based on space-time user similarity
    by Wei Luo, Zhihao Peng, Ansheng Deng 
    Abstract: With the advent of 5G, the way people get information and the means of information transmission have become more and more important. As the main platform of information transmission, social media not only brings convenience to people's lives, but also generates huge amounts of redundant information because of the speed of information updating. Recommendation systems emerged to meet the personalised needs of users and enable them to find interesting information in a large volume of data. As an important tool for helping users filter internet information, recommendation systems play an extremely important role in both academia and industry. Traditional recommendation systems assume that all users are independent. In this paper, in order to improve prediction accuracy, a recommendation system based on space-time user similarity is proposed. Experimental results on a Sina Weibo dataset show that, compared with a traditional collaborative filtering recommendation system based on user similarity, the proposed method achieves better precision, recall and F-measure.
    Keywords: time-based user similarity; space-based user similarity; recommendation system; user preference; collaborative filtering.
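
One way the space-time idea above can be folded into collaborative filtering is to compute a cosine-style similarity over co-rated items, down-weighting each item when the two users' interactions are far apart in time or space. The formula, the decay scales `tau` and `rho`, and the data layout below are illustrative assumptions, not the paper's definition.

```python
import math

def spacetime_similarity(u, v, tau=30.0, rho=10.0):
    """Space-time weighted user similarity (sketch).

    u, v: dicts mapping item -> (rating, timestamp_days, (x, y) location).
    Each co-rated item is weighted by exp(-dt/tau) * exp(-dist/rho), so
    interactions close in both time and space count fully.
    """
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = nu = nv = 0.0
    for item in common:
        ru, tu, (xu, yu) = u[item]
        rv, tv, (xv, yv) = v[item]
        dt = abs(tu - tv)
        dist = math.hypot(xu - xv, yu - yv)
        w = math.exp(-dt / tau) * math.exp(-dist / rho)
        num += w * ru * rv
        nu += ru * ru
        nv += rv * rv
    return num / math.sqrt(nu * nv)
```

Two users with identical ratings at the same times and places get similarity 1.0; the same ratings two months apart are discounted sharply.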

  • Design and analysis of novel hybrid load-balancing algorithm for cloud data centres   Order a copy of this article
    by Ajay Dubey, Vimal Mishra 
    Abstract: The recent pandemic scenario has accelerated a paradigm shift from traditional computing to internet-based computing, and data is now stored and computed in the cloud environment. Cloud Service Providers (CSPs) establish and maintain a huge shared pool of computing resources that provides scalable and on-demand services around the clock without geographical restrictions. Cloud customers access the services and pay according to the resources they use. When millions of users across the globe connect to the cloud for their storage and computational needs, there can be issues such as delays in service; this problem is associated with load balancing in cloud computing. Hence, there is a need to develop effective load-balancing algorithms. The Novel Hybrid Load Balancing (NHLB) algorithm proposed in this paper manages the load of the virtual machines in the data centre. The paper focuses on problems such as performance optimisation, maximum throughput, minimisation of makespan and efficient resource use in load balancing. The NHLB algorithm is more efficient than conventional load-balancing algorithms, with reduced completion time (makespan) and response time, and it distributes tasks equally among the virtual machines on the basis of their current state and the required task time. The paper compares the results of the proposed NHLB algorithm with the dynamic load-balancing algorithm and the honeybee algorithm, and shows that the proposed algorithm outperforms both.
    Keywords: cloud computing; data centre; load balancing; virtual machine; makespan; performance optimisation.

  • The role of smartphone-based social media capabilities in building social capital, trust, and credibility to engage consumers in eWOM: a social presence theory perspective   Order a copy of this article
    by Saqib Mahmood, Ahmad Jusoh, Khalil MD Nor 
    Abstract: Smartphone-based social media has become a well-established channel through which users develop and maintain intimate social relations that enable them to engage in brand-related information exchange, such as eWOM, regardless of time and location. Nevertheless, little is known about the essential elements of smartphone-based social media that help consumers develop intimate social relationships and engage in eWOM. To this end, drawing on social presence theory, the current study develops a research model proposing that interactivity and media richness enhance social presence, giving consumers a sense of psychological proximity. This, in turn, leads to the development of trust and of social capital bonding and bridging, through which consumers' perceived credibility is expected to enable them to engage in eWOM. To empirically investigate the theoretical model, a survey of 407 smartphone-based social media users was conducted in Pakistan. Empirical results reveal that interactivity and media richness enhance social presence, which gives consumers the psychological proximity to develop trust and social capital, further enhancing their perceived credibility to engage in eWOM. Discussion, implications and future directions are described in the final section of the study.
    Keywords: interactivity; media richness; social presence; psychological proximity; social capital; eWOM; trust; smartphone; social media.

  • Cloud infrastructure planning considering the impact of maintenance and self-healing routines over cost and dependability attributes   Order a copy of this article
    by Carlos Melo, Jean Araujo, Jamilson Dantas, Paulo Pereira, Felipe Oliveira, Paulo Maciel 
    Abstract: Cloud computing is the main trend in internet service provision. This paradigm, which emerged from distributed computing, gains more adherents every day. For those who provide, or aim to provide, a service or a private infrastructure, much has to be done: costs related to acquisition and implementation are common, and an alternative for reducing expenses is to outsource maintenance of resources. Outsourcing tends to be a better choice for those who provide small infrastructures than paying employees monthly to keep the service life cycle. This paper evaluates infrastructure reliability and the impact of outsourced maintenance on the availability of private infrastructures. Our baseline environments focus on blockchain as a service; however, by modelling both service and maintenance routines, this study can be applied to most cloud services. The maintenance routines evaluated encompass a set of service level agreements and particularities of reactive, preventive and self-healing methods. The goal is to point out which has the best cost-benefit for those with small infrastructures who still plan to provide services over the internet. Preventive and self-healing repair routines provided a better cost-benefit solution than traditional reactive maintenance routines, but this scenario may change according to the number of available resources the service provider has.
    Keywords: maintenance; reliability; availability; modelling; cloud computing; blockchain; container; services; SLA.

  • Edge computing and its boundaries to IoT and Industry 4.0: a systematic mapping study   Order a copy of this article
    by Matheus Silva, Vinícius Meyer, Cesar De Rose 
    Abstract: In the last decade, cloud computing transformed the IT industry, allowing companies to execute many services that require on-demand availability of computational resources with more flexible provisioning and cost models, including the processing of already growing data volumes. But in the past few years, other technologies such as internet of things and the digitised industry known as Industry 4.0 have emerged, increasing data generation even more. The large amounts of data produced by user-devices and manufacturing machinery have made both industry and academia search for new approaches to process all this data. Alternatives to the cloud centralised processing model and its inherent high latencies have been studied, and edge computing is being proposed as a solution to these problems. This study presents a preliminary mapping of the edge computing field, focusing on its boundaries to the internet of things and Industry 4.0. We began with 219 studies from different academic databases, and after the classification process, we mapped 90 of them in eight distinct edge computing sub-areas and nine categories based on their main contributions. We present an overview of the studies on the edge computing area, which evidences the main concentration sub-areas. Furthermore, this study intends to clarify the remaining research gaps and the main challenges faced by this field, considering the internet of things and Industry 4.0 demands.
    Keywords: edge computing; internet of things; Industry 4.0; systematic mapping.

  • Data collection in underwater wireless sensor networks: performance evaluation of FBR and epidemic routing protocols for node density and area shape   Order a copy of this article
    by Elis Kulla 
    Abstract: Data collection in Underwater Wireless Sensor Networks (UWSN) is not a trivial problem, because of unpredictable delays and unstable links between underwater devices. Moreover, when nodes are mobile, continuous connectivity is not guaranteed, making data collection in UWSN challenging: node scarcity and movement patterns create different environments for data collection in underwater communication. In this paper, we investigate the impact of area shape and node density in UWSN by comparing the Focused Beam Routing (FBR) and Epidemic Routing (ER) protocols. Furthermore, we analyse the correlation between different performance metrics. From simulation results we found that when using FBR, delay and delivery probability slightly decrease (2.1%) but the overhead ratio decreases noticeably (46.9%). The correlation between performance metrics is stronger for the square area shape and is not noticeable for the deep area shape.
    Keywords: underwater wireless sensor networks; focused beam routing; delay tolerant network; area shape; node density; data collection.

  • Joint end-to-end recognition deep network and data augmentation for industrial mould number recognition   Order a copy of this article
    by RuiMing Li, ChaoJun Dong, JiaCong Chen, YiKui Zhai 
    Abstract: With the booming manufacturing industry, the significance of mould management is increasing. At present, manual management is gradually being eliminated owing to the large amount of labour it requires, while the effect of radiofrequency identification (RFID) systems is not ideal, limited by characteristics of the metal such as rust and erosion. Fortunately, the rise of convolutional neural networks (CNNs) brings an image-based solution to mould management: identifying the digital number on the mould. Yet there is no public database for mould recognition, and no special recognition method in this field. To address this problem, this paper first presents a novel dataset to support CNN training. The images in the database are collected in real scenes and finely manually labelled, which can train an effective recognition model that generalises to the actual scenario. Besides, we combine a mainstream text spotter with data augmentation specifically designed for the real world, and find that this has a considerable effect on mould recognition.
    Keywords: mould recognition database; text spotter; mould recognition; data augmentation.

  • An SMIM algorithm for reduction of energy consumption of virtual machines in a cluster   Order a copy of this article
    by Dilawaer Duolikun, Tomoya Enokido, Makoto Takizawa 
    Abstract: Applications can take advantage of virtual computation services independently of the heterogeneity and locations of servers by using virtual machines in clusters. Here, a virtual machine on an energy-efficient host server has to be selected to perform an application process. In this paper, we propose an SMI (Simple Monotonically Increasing) estimation algorithm to estimate the energy consumption of a server performing application processes and the total execution time of the processes on a server. We also propose an SMIM (SMI Migration) algorithm that makes a virtual machine migrate from a host server to a guest server to reduce the total energy consumption of the servers, using the energy estimates of the SMI algorithm. In the evaluation, we show that the SMIM algorithm reduces the energy consumption of servers in a cluster compared with other algorithms.
    Keywords: server selection algorithm; migration of virtual machines; green computing systems; SMI algorithm; SMIM algorithm.
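
The migration decision described above can be sketched generically: estimate each server's energy as power multiplied by execution time, with both growing monotonically in the number of hosted processes (as the SMI name suggests), and migrate a virtual machine only when the cluster-wide estimate drops. The server model and its fields below are illustrative assumptions; the paper's actual SMI estimation is more detailed.

```python
def estimate_energy(server):
    """Toy monotonically increasing energy estimate for one server:
    execution time grows with queued processes, power grows with load."""
    procs = server["procs"]
    exec_time = procs / server["speed"]                      # seconds
    power = server["idle_w"] + server["w_per_proc"] * procs  # watts
    return power * exec_time                                 # joules

def should_migrate(src, dst, vm_procs):
    """SMIM-style decision sketch: migrate a VM carrying `vm_procs`
    processes from src to dst iff total estimated energy decreases."""
    before = estimate_energy(src) + estimate_energy(dst)
    after = (estimate_energy({**src, "procs": src["procs"] - vm_procs}) +
             estimate_energy({**dst, "procs": dst["procs"] + vm_procs}))
    return after < before
```

Offloading a loaded, power-hungry server onto a faster, more frugal one passes the test; the reverse move does not.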

  • FIAC: fine-grained access control mechanism for cloud-based IoT framework   Order a copy of this article
    by Bhagwat Prasad Chaudhury, Kasturi Dhal, Srikant Patnaik, Ajit Kumar Nayak 
    Abstract: Cloud computing technology provides various computing resources on demand to the user on a pay-per-use basis. Users consume the services without establishment and maintenance costs. However, confidentiality and privacy issues limit the technology's adoption. Access control mechanisms are the tools to prevent unauthorised access to remotely stored data. CipherText Policy Attribute-Based Encryption (CPABE) is a widely used tool for giving authorised users fine-grained access to remotely stored encrypted data. In the proposed model FIAC (Fine-grained Access Control Mechanism for Cloud-based IoT Framework), the access control mechanism is embedded in a cloud-based application that measures and reports on the air quality in a city, and only authorised users can view the report and take appropriate action. The major contribution of this work is the design of three attribute-based algorithms: key generation, encryption and decryption. Carbon dioxide concentration, dust, temperature and relative humidity are the air quality parameters considered. Further, the computation time of the model is found to be encouraging, so it can be used in low-power devices. The experimental outcomes establish the usability of our model.
    Keywords: air pollution; carbon dioxide concentrations; dust density; internet of things; FIAC; access control.
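
Fine-grained attribute-based access control of the CP-ABE kind can be illustrated at the policy level: a ciphertext carries an access policy expressed as a tree of AND/OR gates over attributes, and only a user whose attribute set satisfies the tree can decrypt. The sketch below evaluates such a policy in plain Python; real CP-ABE enforces the same check cryptographically, and the attribute names are invented for illustration.

```python
def satisfies(policy, attrs):
    """Evaluate a tiny AND/OR attribute-policy tree against a user's
    attribute set. Policies are nested tuples:
    ("attr", name) | ("and", p1, p2, ...) | ("or", p1, p2, ...)."""
    op = policy[0]
    if op == "attr":
        return policy[1] in attrs
    if op == "and":
        return all(satisfies(p, attrs) for p in policy[1:])
    if op == "or":
        return any(satisfies(p, attrs) for p in policy[1:])
    raise ValueError("unknown policy node: %r" % (op,))

# e.g. air-quality reports readable by city officials, or by researchers
# who also belong to the environment department (names are illustrative)
policy = ("or", ("attr", "city_official"),
                ("and", ("attr", "researcher"), ("attr", "env_dept")))
```

In CP-ABE the user's secret key encodes the attributes, so this check happens during decryption rather than in application code.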

  • University ranking approach with bibliometrics and augmented social perception data   Order a copy of this article
    by Kittayaporn Chantaranimi, Rattasit Sukhahuta, Juggapong Natwichai 
    Abstract: Typically, universities aim to achieve a high position in ranking systems for their reputation. However, self-evaluating rankings can be costly because the indicators draw not only on bibliometrics but also on the results of over a thousand surveys. In this paper, we propose a novel approach to estimate university rankings based on traditional data, i.e., bibliometrics, and non-traditional data, i.e., Altmetric Attention Scores and Sustainable Development Goals indicators. Our approach estimates subject-area rankings in Arts & Humanities, Engineering & Technology, Life Sciences & Medicine, Natural Sciences, and Social Sciences & Management. The results are evaluated against the QS subject ranking using Spearman rank-order correlation and overlapping rate. Our approach estimates rankings effectively, particularly the top-10, and could assist stakeholders in estimating a university's position when the survey is not available.
    Keywords: university ranking; rank similarity; bibliometrics; augmented social perception data; sustainable development goals; Altmetrics.

  • Performance comparison of various machine learning classifiers using a fusion of LBP, intensity and GLCM feature extraction techniques for thyroid nodules classification   Order a copy of this article
    by Rajshree Srivastava, Pardeep Kumar 
    Abstract: Machine learning (ML) and feature extraction techniques have shown great potential in the medical imaging field. This work presents an effective approach for the identification and classification of thyroid nodules. In the proposed model, various features are extracted using the grey-level co-occurrence matrix (GLCM), local binary pattern (LBP) and an intensity-based matrix. These features are fed to various ML classifiers: k-nearest neighbour (KNN), decision tree (DT), artificial neural network (ANN), naive Bayes, extreme gradient boosting (XGBoost), random forest (RF), linear regression (LR) and support vector machine (SVM). The result analysis shows that the proposed Model-4 performs better than the other seven proposed models and the reported literature, with an improvement of 4% to 5% in the performance evaluation.
    Keywords: machine learning; LBP; GLCM; intensity; noise removal; feature extraction.
    DOI: 10.1504/IJGUC.2023.10055313
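
Of the features mentioned above, the local binary pattern is simple enough to sketch: each interior pixel is encoded by thresholding its eight neighbours against it, and the resulting 8-bit codes are typically histogrammed into a texture descriptor. This is the basic 3x3 LBP, not necessarily the exact variant used in the paper.

```python
def lbp_codes(img):
    """Basic 3x3 local binary pattern over a 2D list of grey levels.

    For each interior pixel, the 8 neighbours (clockwise from top-left)
    each contribute one bit: 1 if the neighbour is >= the centre.
    Returns the (h-2) x (w-2) grid of 8-bit codes.
    """
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = len(img), len(img[0])
    out = []
    for i in range(1, h - 1):
        row = []
        for j in range(1, w - 1):
            c = img[i][j]
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if img[i + di][j + dj] >= c:
                    code |= 1 << bit
            row.append(code)
        out.append(row)
    return out
```

A flat patch yields code 255 (all neighbours equal the centre), while a bright isolated peak yields code 0, which is what makes the codes sensitive to local texture.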
  • Strongly correlated high utility item-sets recommendation in e-commerce applications using EGUI-tree over data stream   Order a copy of this article
    by P. Amaranatha Reddy, M. H. M. Krishna Prasad, S. Rao Chintalapudi 
    Abstract: The product recommendation feature helps customers select the right items in e-commerce applications. Grounded on the previous purchase histories of similar customers, it recommends items to customers, who may or may not choose from the recommended list. If customers like and purchase the recommended items, both buyers and sellers benefit: sellers gain increased sales, and buyers save search time. Thus, a recommendation system must be designed so that the suggested items have High Utility (HU) and strong correlation. HU items yield high profit, and strongly correlated items have a higher probability of selection. As both measurements play significant roles in the business process, both need to be taken care of. Here, to use up-to-date data, the stream of purchase transactions is mined with the sliding window technique to extract such item-sets.
    Keywords: high utility item-set mining; recommendation system; utility; correlation; data stream; sliding window.
    DOI: 10.1504/IJGUC.2023.10057772
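
The sliding-window mining step can be sketched as follows: item utilities (e.g. price times quantity) are accumulated per transaction, and when the window exceeds its capacity the oldest transaction's contribution is subtracted, so high-utility items always reflect recent purchases only. The class and window size are illustrative; the paper's EGUI-tree structure and correlation measure are not reproduced here.

```python
from collections import deque

class SlidingWindowUtility:
    """Per-item utility totals over the last `w` stream transactions."""

    def __init__(self, w=3):
        self.w = w
        self.window = deque()
        self.utility = {}          # item -> total utility in the window

    def add(self, txn):
        """txn: dict item -> utility (e.g. price x quantity)."""
        self.window.append(txn)
        for item, u in txn.items():
            self.utility[item] = self.utility.get(item, 0) + u
        if len(self.window) > self.w:
            old = self.window.popleft()        # slide: drop oldest txn
            for item, u in old.items():
                self.utility[item] -= u

    def high_utility_items(self, min_util):
        return {i for i, u in self.utility.items() if u >= min_util}
```

Extending this from single items to item-sets (and adding a correlation threshold) is where the paper's EGUI-tree comes in.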
  • A survey on deduplication systems   Order a copy of this article
    by Amdewar Godavari, Chapram Sudhakar 
    Abstract: With the arrival of new technological trends such as big data and the internet of things, a tremendous amount of duplicate data is being generated. Duplicate data wastes storage capacity and degrades the performance of storage systems. Data deduplication is a storage optimisation technique used to eliminate duplicate data. Deploying a deduplication system for primary or secondary storage is challenging owing to the extra latency incurred by deduplication processing. Apart from this, as duplicates are eliminated, deduplication affects the contiguous placement of data on the disk, which is known as the disk fragmentation problem. This paper gives an overview of issues and solutions proposed for deploying a deduplication component for primary and/or secondary storage systems with centralised or distributed approaches. Experiments are conducted using the Destor tool on different data sets, and the results are used to study the effect of different chunking algorithms on the deduplication phases.
    Keywords: data fragmentation; deduplication; disk bottleneck.
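
The core deduplication idea surveyed above can be sketched with fixed-size chunking: each chunk is fingerprinted with a cryptographic hash, a chunk whose fingerprint is already indexed is stored only once, and every file is kept as a recipe of fingerprints. Content-defined chunking, which deduplication systems often prefer because chunk boundaries survive insertions, follows the same store/recipe pattern; the class and chunk size here are illustrative.

```python
import hashlib

def chunk_fixed(data, size):
    """Split a bytes object into fixed-size chunks (last may be short)."""
    return [data[i:i + size] for i in range(0, len(data), size)]

class DedupStore:
    """Fingerprint-indexed chunk store: duplicate chunks stored once."""

    def __init__(self, chunk_size=4):
        self.chunk_size = chunk_size
        self.store = {}        # fingerprint -> chunk bytes (unique)
        self.recipes = {}      # file name -> ordered fingerprint list

    def put(self, name, data):
        fps = []
        for chunk in chunk_fixed(data, self.chunk_size):
            fp = hashlib.sha256(chunk).hexdigest()
            self.store.setdefault(fp, chunk)   # dedup happens here
            fps.append(fp)
        self.recipes[name] = fps

    def get(self, name):
        """Reassemble a file from its recipe (the restore path whose
        random chunk reads cause the disk fragmentation problem)."""
        return b"".join(self.store[fp] for fp in self.recipes[name])
```

Two files sharing a common prefix share the prefix's chunks in the store, so total stored chunks stay below the sum of the files' chunk counts.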

  • Modified tabu-based ant colony optimisation algorithm for energy-efficient cloud computing systems   Order a copy of this article
    by Jyoti Chauhan, Taj Alam 
    Abstract: Cloud computing (CC) has gained huge popularity in the recent era for hosting and providing services via the internet. However, the widespread adoption of cloud computing and the rapid rise in the capacity and scale of data centres result in a significant increase in electricity usage, rising data centre ownership costs and increased carbon footprints. One of the challenging research problems at this point in time is to reduce the energy consumption of cloud data centres, which motivates green cloud computing. The number of resources allocated to a task directly relates to the power usage of virtual machines and data centres; therefore, by reducing the resources allocated to perform a task, power consumption can be decreased. Hence task scheduling, a well-known NP-hard problem, needs to be addressed to facilitate green CC, as it influences the overall efficiency of the cloud system. This paper presents a novel energy-efficient task scheduling approach for the heterogeneous cloud environment: a modified Tabu-based Ant Colony Optimisation algorithm (TACO), applied to minimise energy consumption, reduce makespan and maximise average resource usage. The proposed TACO approach maintains long-term memory (LTM), in the form of an aspiration list, along with short-term memory (STM), in the form of a tabu list, to make scheduling decisions. It is evaluated in the CloudSim simulation environment against existing metaheuristics, energy-based Particle Swarm Optimisation (PSO) and an energy-based Genetic Algorithm (GA). The simulation results clearly show that the proposed TACO algorithm outperforms the considered approaches in terms of energy consumption for our objective function.
    Keywords: ant colony optimisation; cloud computing; scheduling; particle swarm optimisation; genetic algorithm; energy efficiency; green computing; tabu.

  • Performance evaluation using throughput and latency of a blockchain-enabled patient centric secure and privacy preserve EHR-based on IPFS   Order a copy of this article
    by Vishal Sharma, Niranjan Lal, Anand Sharma 
    Abstract: Every nation needs a better healthcare system and services that make digital medical records available to the general public at scale. However, patients' health data are too sensitive to share and too insecure to store in centralised storage, so security and privacy must be ensured alongside better storage and retrieval methods for the Patient Electronic Health Record (PEHR). Blockchain allows PEHRs to be exchanged securely and effectively in a decentralised, tamper-proof and traceable distributed ledger, stored in encrypted form on the InterPlanetary File System (IPFS) using the Hyperledger Fabric (HLF) framework. The Hyperledger Caliper benchmark measures the blockchain network's performance in terms of transaction throughput and latency. This paper discusses the performance evaluation of a Blockchain-Enabled Patient-Centric Secure (BEPCS) and privacy-preserving electronic health record on IPFS. It proposes a strategy that may increase throughput by 5-10% and decrease latency by 5-10% while providing better security and privacy.
    Keywords: medical data security; patient electronic healthcare records; consortium blockchain; InterPlanetary file system; medical data privacy preservation; chaincode; patient-directed healthcare system.
    DOI: 10.1504/IJGUC.2023.10059539
  • Target imaging technology of wireless orbital communication radar   Order a copy of this article
    by Xin Tan, Chaoqi Wang, Mingwei Wang, Wenyuan Liu, Xianghui Wang 
    Abstract: Radar observes a target by transmitting stepped-frequency or chirp signals. After receiving the target echo, it performs pulse compression in the range direction and uses the Fourier transform for imaging in the azimuth direction. Radar imaging is essentially the inversion of the target's electromagnetic scattering distribution in space, so the structure and size of the target can be obtained from the radar image for subsequent target detection and recognition. With the continuous advancement of technology, non-communication radar imaging has become increasingly widely used in air observation and Earth observation. In this paper, the imaging technology of wireless orbital communication radar is systematically studied, and deviations in image quality are compensated by motion autofocus. The results show that the video-based backward projection algorithm proposed in this paper outperforms the RD algorithm.
    Keywords: wireless communication radar; radar imaging technology; target recognition imaging; SAR imaging; Bayesian imaging.
    DOI: 10.1504/IJGUC.2023.10059686
  • Optimisation of the hybrid grey wolf method in cluster-based wireless sensor network using edge computing   Order a copy of this article
    by Ashok Kumar Rai, Rakesh Kumar 
    Abstract: Wireless Sensor Networks (WSNs) cover most secure data transfer applications and play a significant role in the IoT for primary data collection, which demands energy-efficient data transfer and an improved network lifetime. The major challenge for these protocols is forming optimum clusters and selecting Cluster Heads (CHs) for efficient operation. A WSN also plays a critical role in parallel computation, in which resources can be assigned to sub-tasks and the load equalised to improve the network lifetime. This paper applies the Grey Wolf Optimisation (GWO) algorithm in the proposed work by observing two variables, Residual Energy (RE) and node distance (DS) from the Base Station (BS), visualising and analysing GWO under variable parameters in the WSN. This approach identifies the most suitable node among all normal nodes for CH selection. The outcomes demonstrate that GWO improves the performance of the proposed model.
    Keywords: base station; cluster head; energy efficiency; grey wolf optimisation; wireless sensor network.
    DOI: 10.1504/IJGUC.2023.10060014
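The abstract above selects a cluster head from residual energy (RE) and distance to the base station (DS). As a rough illustration only (the weighting, normalisation and node values below are our assumptions, not the paper's GWO formulation), a fitness of this kind can be sketched as:

```python
# Illustrative cluster-head selection fitness: reward high residual
# energy, penalise distance to the base station. The weight alpha and
# the min-max style normalisation are assumptions for this sketch.
def ch_fitness(re, ds, re_max, ds_max, alpha=0.6):
    """Higher is better: high residual energy, low distance to BS."""
    return alpha * (re / re_max) + (1 - alpha) * (1 - ds / ds_max)

def select_cluster_head(nodes, alpha=0.6):
    """nodes: list of (node_id, residual_energy, distance_to_bs)."""
    re_max = max(n[1] for n in nodes)
    ds_max = max(n[2] for n in nodes)
    best = max(nodes, key=lambda n: ch_fitness(n[1], n[2], re_max, ds_max, alpha))
    return best[0]

# Hypothetical nodes: (id, residual energy in J, distance to BS in m)
nodes = [("n1", 0.9, 40.0), ("n2", 0.5, 10.0), ("n3", 0.8, 15.0)]
print(select_cluster_head(nodes))
```

In the full GWO approach, such a fitness would score candidate wolves (CH assignments) during the search rather than being applied once.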
  • Detection of Crop Disorder using Deep Learning   Order a copy of this article
    by Vinita Chauhan, Suma Dawn 
    Abstract: An estimated 14% of global yield is lost to plant diseases each year, causing suffering to billions of people. Plant pathology studies the diseases, microbes and climatic conditions that lead to plant death, with plant survival as its focus. Temperature, pH, humidity and moisture can all cause plant diseases. Misdiagnosis can result in chemical misuse, environmental imbalance and drug resistance. Diseases can be diagnosed by human scouting, but image analysis of plant leaves can help diagnose them automatically. Automated disease detection involves image selection, pre-processing, segmentation, feature augmentation and model prediction. In recent years, deep convolutional networks have detected and classified crop diseases accurately, using images of diseased plants as input to predict disease-crop pairs, and deep neural networks have been applied successfully in several domains. Developing an accurate image classifier requires a large dataset of diseased plant images; the Mendeley database of healthy and diseased crop images is used to address this need. The goal is to detect leaf disease.
    Keywords: deep Learning; crop disease detection; Resnet-50; Mobilenet; DenseNet-121; EfficientnetB0; image processing.
    DOI: 10.1504/IJGUC.2023.10060098
  • An enhanced human learning optimisation algorithm for effective data clustering   Order a copy of this article
    by Jigyasa Goyal, Yugal Kumar, Pardeep Kumar, Arvinder Kaur 
    Abstract: Clustering arranges data objects into clusters so that similar data objects are placed in the same cluster. No single algorithm works effectively on all types of clustering problem, and the quality of the resulting clusters is also an important issue for clustering algorithms. This work addresses these issues and presents a new clustering algorithm based on Human Learning Optimisation (HLO). Several modifications are incorporated into the HLO algorithm, called enhanced HLO (EHLO), to alleviate the issues of similar individual learning ability and control parameters. The similar learning ability of individuals is enhanced using the learner phase of the TLBO algorithm, and the control parameter issue of random learning is resolved through a logistic chaotic map. Well-known clustering datasets are chosen for the experiments, and the results are compared with eight state-of-the-art clustering algorithms using several performance metrics. The results show that EHLO obtains superior clustering results to the other algorithms.
    Keywords: cluster analysis; data clustering; heuristics; human learning optimisation; clusters.
    DOI: 10.1504/IJGUC.2023.10060384
  • Developing software predictive model for examining software bugs using machine learning   Order a copy of this article
    by Swati Singh, Monica Mehrotra, Taran Singh Bharti 
    Abstract: Software fault prediction is an emerging research area in software engineering and an important issue for the IT industry and professionals. In the traditional approach, prior information about an application is needed to determine its faults or faulty modules. If machine learning techniques are used instead, models can be automated so that application software can predict and recover from faults on its own. This capability helps application software execute more productively and minimises faults, cost and time. In this research, we consider software predictive models built using subsets of artificial intelligence-based approaches. We also use prominent benchmark techniques to evaluate the performance of the software predictive models. Researchers and software practitioners can draw independent insights from this research and select automated tasks for their intended applications.
    Keywords: machine learning; software predictive model; software faults.
    DOI: 10.1504/IJGUC.2023.10060445
  • A page weight-based replacement algorithm to enhance the performance of buffer management in flash memory   Order a copy of this article
    by Shweta ., P.K. Singh 
    Abstract: Flash memory is used as secondary storage in various handheld electronic devices such as laptops and PDAs because of its excellent performance, low energy consumption, compact size, high access speed and shock resistance, with growing density and falling prices. However, its intrinsic properties, such as the lack of in-place updates and asymmetric I/O operations, make it challenging to design buffer replacement strategies. This paper proposes an improved buffer management strategy for flash memory, the Page Weight Buffer Replacement (PWBR) algorithm, which considers the weight of the pages in the buffer. Its eviction approach tries to minimise the number of write operations while maintaining a high buffer hit rate by integrating recency, operational cost and temporal locality. Our findings show that PWBR is superior to existing buffer management policies, increasing the hit ratio over LRU-WSR, CF-LRU, CCF-LRU and AD-LRU by 9.3%, 6.4%, 3.7% and 2.5%, respectively.
    Keywords: buffer replacement algorithm; frequency; recency; page migration.
    DOI: 10.1504/IJGUC.2023.10060590
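The eviction idea described above (weigh pages by recency and the asymmetric cost of flushing a dirty page to flash) can be illustrated with a small sketch. The weight formula, cost constant and page fields below are our assumptions for illustration, not the PWBR formula:

```python
# Illustrative weight-based eviction for a flash-aware buffer:
# recently used, frequently used and dirty pages get higher weight
# (a dirty eviction costs a flash write, which is far more expensive
# than a read). The victim is the lowest-weight page.
WRITE_COST = 4.0   # assumed relative cost of a flash write vs. a read

def page_weight(page, now):
    recency = 1.0 / (1.0 + now - page["last_access"])
    cost = WRITE_COST if page["dirty"] else 1.0
    return recency * page["freq"] * cost

def choose_victim(buffer, now):
    """Evict the lowest-weight page; cold clean pages tend to go first."""
    return min(buffer, key=lambda p: page_weight(p, now))["id"]

# Hypothetical buffer state at logical time 100
buffer = [
    {"id": "p1", "last_access": 90, "freq": 1, "dirty": False},
    {"id": "p2", "last_access": 99, "freq": 5, "dirty": True},
    {"id": "p3", "last_access": 95, "freq": 2, "dirty": False},
]
print(choose_victim(buffer, now=100))
```

The design point is that biasing eviction towards cold clean pages delays dirty write-backs, which is exactly what reduces write counts on flash.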
  • Virtual traditional craft simulation system in mixed reality environment   Order a copy of this article
    by Rihito Fuchigami, Tomoyuki Ishida 
    Abstract: In a previous study, we implemented a high-presence virtual traditional craft system using a head-mounted display (HMD) and a mobile traditional craft presentation system using augmented reality (AR). However, the high-presence virtual traditional craft system required different cultural architectures to be constructed in advance, which involved an enormous amount of work, and the mobile traditional craft presentation system lacked a sense of presence because users experienced traditional crafts on the mobile terminal's flat display. Therefore, in this study, we developed a mixed reality (MR) virtual traditional craft simulation system using an HMD. With MR technology, we have overcome the work-cost and low-presence issues associated with constructing a virtual reality (VR) space. We conducted a comparative evaluation experiment with 30 subjects to evaluate the proposed system, obtaining high ratings for the system's presence and applicability; however, several operability issues were identified.
    Keywords: mixed reality; augmented reality; interior simulation; Japanese traditional crafts; immersive system.

  • An efficient privacy-preservation algorithm for incremental data publishing   Order a copy of this article
    by Torsak Soontornphand, Mizuho Iwaihara, Juggapong Natwichai 
    Abstract: Data can be collected continuously and grow all the time, and privacy protection designed for static data may not cope with this situation effectively. In this paper, we present an efficient privacy preservation approach based on (k, l)-anonymity for incremental data publishing. We first illustrate three privacy attacks: similarity, difference and joint attacks. Then, three characteristics of incremental data publishing are analysed and exploited to detect privacy violations efficiently. With these characteristics, similarity and joint attack detection can be skipped for stable releases; in addition, only a subtype of the similarity attack and the most recently released dataset need to be examined. Experimental results show that the proposed method is highly efficient, with an average execution time eleven times lower than that of a compared static algorithm, while also maintaining better data quality than the compared methods in every setting.
    Keywords: privacy preservation; incremental data publishing; privacy attack; full-domain generalisation.

  • Cloud workflow scheduling algorithm based on multi-objective particle swarm optimisation   Order a copy of this article
    by Hongfeng Yin, Baomin Xu, Weijing Li 
    Abstract: Owing to the characteristics of market-oriented cloud computing, the objective function of cloud workflow scheduling algorithm should not only consider the running time, but also consider the running costs. The nature of cloud workflow scheduling is to map each task of a workflow instance to appropriate computing resources. Owing to the existence of temporal dependencies and causal dependencies between tasks, the scheduling of cloud workflow instance becomes more complex. The main contribution of this paper is to propose a cloud workflow scheduling algorithm based on multi-objective particle swarm optimisation. The algorithm takes makespan and total cost as two objectives. It provides users with a set of Pareto optimal solutions to select an optimal scheduling scheme according to their own preferences. The performance of our algorithm is compared with state-of-the-art multi-objective meta-heuristics and classical single-objective scheduling algorithm. The simulation results show that our solution delivers better convergence and optimisation capability as compared to others. Hence, it is applicable to solve multi-objective optimisation problems for scheduling workflows over cloud platform.
    Keywords: multi-objective optimisation; cloud computing; particle swarm optimisation; workflow scheduling.
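The abstract above centres on returning a set of Pareto-optimal (makespan, cost) schedules. The notion of non-dominated filtering that such an algorithm applies to its swarm can be sketched generically; this is not the paper's algorithm, and the candidate values are invented:

```python
# Minimal Pareto-front filter for (makespan, cost) schedules, the kind
# of non-dominated archive a multi-objective PSO maintains; both
# objectives are minimised.
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical (makespan, cost) pairs for candidate schedules
candidates = [(10, 5), (8, 7), (12, 4), (9, 6), (11, 6)]
print(sorted(pareto_front(candidates)))
```

A user then picks one schedule from the front according to their own time/cost preference, which is exactly the selection step the abstract describes.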

  • Intelligent system design of urban street landscape device based on improved D*Lite algorithm   Order a copy of this article
    by Qian Li 
    Abstract: The urban street landscape is a beautiful feature of the city; it is both natural and social, and forms a complex spatial system. Urban street landscape design that depends simply on the designer's experience, feeling and understanding lacks a scientific basis. A systematic, scientific and holistic design approach is needed to guide the organisation and expression of design elements and achieve multi-level design purposes. To help overcome problems such as monotonous street design, this paper analyses the elements, functions, principles, system strategies and methods of the street landscape system. Comparing the improved D*Lite algorithm with the original D*Lite algorithm on street landscape design paths and people's satisfaction, the results show that the street landscape path produced by the improved D*Lite algorithm is relatively smooth: the number of extremely dissatisfied people decreased by 20, and the number of very satisfied people increased by 15. This indicates that the improved D*Lite algorithm can further improve urban street landscape design.
    Keywords: intelligent system design; urban street landscape; improved D*Lite algorithm; design strategy; street landscape design.
    DOI: 10.1504/IJGUC.2023.10059754
  • INFRDET: IoT network flow regulariser-based detection and classification of IoT botnet   Order a copy of this article
    by Umang Garg, Santosh Kumar, Manoj Kumar 
    Abstract: The Internet of Things (IoT) botnet is one of the attacks that affects the operation of legitimate IoT devices. In this paper, a novel lightweight intelligent system is devised that uses traffic analysis and regularisation to detect botnet-infected devices in an IoT network. The system operates on a low-powered Raspberry Pi device using network packet counts. In addition, an IoT Network Flow Regulariser (INFR) algorithm is proposed and embedded to transform network flows into uniform-length traffic frames. The experimental results show that the proposed system with the INFR algorithm performs better than existing work. Furthermore, to classify benign and malicious traffic, a novel method is used to visualise network activities through graphical heatmaps, which are then investigated using a hybrid Convolutional Neural Network (CNN) model with and without the INFR algorithm, showing a remarkable improvement in results.
    Keywords: IoT botnet; deep learning; CNN; DDoS; VGG.
    DOI: 10.1504/IJGUC.2023.10059519
  • Intrusion detection and prevention with machine learning algorithms   Order a copy of this article
    by Victor Chang, Sreeja Boddu, Qianwen Ariel Xu, Le Minh Thao Doan 
    Abstract: In recent decades, computer networks have played a key role in modern life, which has also escalated the number of new attacks on internet traffic. An Intrusion Detection System (IDS) is imperative for complementing firewalls and anti-virus software by detecting intrusions (bad connections). Many researchers are striving to overcome the challenges of IDS and to achieve better accuracy in automatically distinguishing normal from abnormal connections, focusing on traditional machine learning and deep learning algorithms to detect internal and external network connections automatically. This paper adopts various Machine Learning (ML) techniques, such as Bayes Network, Random Forest, Decision Table and Nearest Neighbour. The KDDcup-1999 dataset, regarded as a reliable benchmark, covers a wide range of network environments. A framework to catch attacks is also proposed, with a detection rate of more than 98%, suggesting its potential application in practice to detect intrusions and contribute to the cybersecurity field.
    Keywords: machine learning; deep learning; security data set; intrusion detection.

  • Privacy-aware trajectory data publishing: an optimal efficient generalisation algorithm   Order a copy of this article
    by Nattapon Harnsamut, Juggapong Natwichai 
    Abstract: With the spread of location-aware technologies that provide positioning services, it is easy to collect users' trajectory data. Such data often contain sensitive information, i.e., private locations, private trajectories or paths, and other sensitive attributes. In this paper, we propose a privacy preservation algorithm based on the LKC-privacy model to protect against privacy attacks in trajectory data publishing. Not only can data utility be maintained effectively through the data generalisation approach, but the algorithm can also reduce the computing time of look-up table creation, one of the most computationally expensive processes. Our proposed algorithm is evaluated with extensive experiments, which show that the enhanced-LT algorithm outperforms the baseline LT algorithm in execution time by 48.5% on average. The results show that our proposed algorithm returns not only the optimal solution but also highly efficient computation times.
    Keywords: LKC-privacy; trajectory data publishing; optimal algorithm; generalisation technique.

  • Deer-based chicken swarm optimisation algorithm: a hybrid optimisation algorithm for output domain testing   Order a copy of this article
    by Ramgouda Patil, V. Chandraprakash 
    Abstract: An effective combinatorial test case generation framework is devised in this research using a hybrid optimisation algorithm, Deer-based Chicken Swarm Optimisation (DCSO), to generate test cases for output domain testing of embedded systems. The developed DCSO is an amalgamation of the Deer Hunting Optimisation Algorithm (DHOA) and the Chicken Swarm Optimisation (CSO) algorithm. Three worst-case resource usage scenarios are considered as objectives: Worst-Case Execution Time (WCET), Worst-Case Suite Size (WCSS) and Worst-Case Stack Usage (WCSU). The developed DCSO algorithm is used for output domain testing of embedded systems by generating combinatorial test cases, and is analysed using metrics such as fitness and test suite size. Compared with existing test case generation methods, the proposed DCSO algorithm obtained a minimum test suite size of 84 and a minimum fitness of 3.66.
    Keywords: deer CSO; combinatorial testing; test case generation; embedded systems; output domain testing.
    DOI: 10.1504/IJGUC.2023.10060037

Special Issue on: CONIITI 2019 Intelligent Software and Technological Convergence

  • Computational intelligence system applied to plastic microparts manufacturing process   Order a copy of this article
    by Andrés Felipe Rojas Rojas, Miryam Liliana Chaves Acero, Antonio Vizan Idoipe 
    Abstract: In the search for knowledge and technological development, new analysis and processing techniques closer to human reasoning have proliferated. With the growth of computational systems, hardware production needs have also increased: parts with millimetric to micrometric features are required for optimal system performance, so the demand for injection moulding is rising as well. Injection moulding is a complex manufacturing process because its mathematical modelling is not yet established; therefore, computational intelligence can address the selection of correct values for the injection variables. This article presents the development of a computational intelligence system integrating fuzzy logic and neural network techniques with a CAE modelling system to support injection machine operators in selecting optimal machine process parameters, producing good-quality microparts with fewer process runs. Tests carried out with this computational intelligence system have shown a 30% improvement in the efficiency of the injection process cycles.
    Keywords: computational intelligence; neural networks; fuzzy logic; micro-parts; plastic parts; computer vision; expert systems; injection processes; CAD; computer-aided design systems; CAE; computer-aided engineering.

Special Issue on: ICIMMI 2019 Emerging Trends in Multimedia Processing and Analytics

  • An optimal channel state information feedback design for improving the spectral efficiency of device-to-device communication   Order a copy of this article
    by Prabakar Dakshinamoorthy, Saminadan Vaitilingam 
    Abstract: This article introduces a regularised zero-forcing (RZF) based channel state information (CSI) feedback design for improving the spectral efficiency of device-to-device (D2D) communication. The proposed method exploits a conventional feedback design along with optimised CSI to regulate communication flows in the environment. The codebook-dependent precoder design improves the feedback rate by streamlining time/frequency-dependent scheduling. Incoming communication traffic is scheduled across the available channels by pre-estimating their adaptability and capacity across the underlying network, which allows partial channel information to be exchanged between communicating devices without relying on base station services. These features reduce transmission error rates and achieve a better sum rate irrespective of the distance between and transmit power of the devices.
    Keywords: CSI; D2D; feedback design; precoding; zero-forcing.

Special Issue on: Green Network Communication for Sustainable Smart Grids Current Uses and Future Applications

  • Performance enhancement of MIMO-OFDM using hybrid equalisers-based ICI mitigation with channel estimation in time varying channels   Order a copy of this article
    by Madhavi Latha Pandala, Samanthapudi Swathi, Abdul Hussain Sharief, Suresh Penchala, Ganga Rama Koteswara Rao, Pala Mahesh Kumar 
    Abstract: High spectral efficiency and resistance to interference make Multiple-Input Multiple-Output Orthogonal Frequency Division Multiplexing (MIMO-OFDM) an exceptionally good choice for realising Long-Term Evolution-Advanced (LTE-A). In this article, both time-variant training-based and time-invariant Channel Estimation (CE) are considered for MIMO-OFDM. Further, a combination of three equalisation techniques, ZF-MMSE-SIC, is presented to improve the BER and ICI performance of the MIMO-OFDM system: Zero Forcing (ZF) is utilised for the time-variant CE scheme and Minimum Mean Square Error (MMSE) for the time-invariant CE scheme, while Successive Interference Cancellation (SIC) reduces the residual ICI to a minimum by enhancing the Carrier-to-Interference Ratio (CIR). Extensive simulations reveal that the proposed hybrid methodology outperforms conventional ICI mitigation algorithms over different time-varying channels while improving both BER and spectral efficiency.
    Keywords: MIMO-OFDM; ICI cancellation; zero-forcing; minimum mean square error; carrier-to-interference ratio; successive interference cancellation.
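For context, the ZF and MMSE equalisers combined in the abstract above differ only in a noise-dependent regularisation term in the channel inverse. The sketch below is a generic textbook formulation, not the authors' implementation, and the channel and signal values are invented:

```python
import numpy as np

# Illustrative ZF and MMSE equalisers for a MIMO channel y = H x + n.
# MMSE adds the noise variance to the Gram matrix, trading residual
# interference against noise amplification; with zero noise it reduces
# to ZF.
def zf_equalise(H, y):
    return np.linalg.solve(H.conj().T @ H, H.conj().T @ y)

def mmse_equalise(H, y, noise_var):
    G = H.conj().T @ H + noise_var * np.eye(H.shape[1])
    return np.linalg.solve(G, H.conj().T @ y)

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 2))     # hypothetical 4x2 channel matrix
x = np.array([1.0, -1.0])       # transmitted symbols
y = H @ x                       # noise-free received vector for the sketch
x_zf = zf_equalise(H, y)        # ZF recovers x exactly without noise
```

Under noise, ZF can amplify it badly on ill-conditioned channels, which is why the MMSE variant (and SIC on top of it) is preferred in time-varying conditions.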

Special Issue on: AMLDA 2022 Applied Machine Learning and Data Analytics Applications, Challenges, and Future Directions

  • Fuzzy forests for feature selection in high-dimensional survey data: an application to the 2020 US Presidential Election   Order a copy of this article
    by Sreemanti Dey, R. Michael Alvarez 
    Abstract: An increasingly common methodological issue in the field of social science is high-dimensional and highly correlated datasets that are unamenable to the traditional deductive framework of study. Analysis of candidate choice in the 2020 Presidential Election is one area in which this issue presents itself: in order to test the many theories explaining the outcome of the election, it is necessary to use data such as the 2020 Cooperative Election Study Common Content, with hundreds of highly correlated features. We present the fuzzy forests algorithm, a variant of the popular random forests ensemble method, as an efficient way to reduce the feature space in such cases with minimal bias, while also maintaining predictive performance on par with common algorithms such as random forests and logit. Using fuzzy forests, we isolate the top correlates of candidate choice and find that partisan polarisation was the strongest factor driving the 2020 Presidential Election.
    Keywords: fuzzy forests; machine learning; ensemble methods; dimensionality reduction; American elections; candidate choice; correlation; partisanship; issue voting; Trump; Biden.

  • An efficient intrusion detection system using unsupervised learning AutoEncoder   Order a copy of this article
    by N.D. Patel, B.M. Mehtre, Rajeev Wankar 
    Abstract: As attacks on network environments have become more sophisticated and intelligent in recent years, the limitations of existing signature-based intrusion detection systems have become more evident. For new attacks such as Advanced Persistent Threats (APTs), signature patterns generalise poorly. Research on intrusion detection systems based on machine learning is being actively conducted to solve this problem. However, far fewer attack samples than normal samples are collected in real network environments, resulting in a class imbalance problem: when a supervised learning-based anomaly detection model is trained on such data, its results are biased towards normal samples. In this paper, an AutoEncoder (AE) is used to perform single-class anomaly detection to solve this imbalance problem. The experimental evaluation was conducted using the CIC-IDS2017 dataset, and the performance of the proposed method was compared with supervised models.
    Keywords: intrusion detection system; advanced persistent threat; CICIDS2017; AutoEncoder; machine learning; data analytics.
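The single-class idea described above (train only on normal traffic, flag samples the model reconstructs poorly) can be illustrated with a minimal sketch. To keep it dependency-light we stand in a linear autoencoder (truncated PCA) for the paper's neural AE; the synthetic data, latent dimension and 99th-percentile threshold are our assumptions, not the paper's:

```python
import numpy as np

# One-class anomaly detection by reconstruction error: fit on normal
# samples only, then flag anything the encoder/decoder pair cannot
# reconstruct well. A linear AE (truncated PCA) stands in for a neural AE.
def fit(normal, k):
    mu = normal.mean(axis=0)
    _, _, Vt = np.linalg.svd(normal - mu, full_matrices=False)
    return mu, Vt[:k]                  # mean + top-k components as weights

def recon_error(X, mu, W):
    Z = (X - mu) @ W.T                 # encode into k dimensions
    Xh = Z @ W + mu                    # decode back to feature space
    return np.linalg.norm(X - Xh, axis=1)

rng = np.random.default_rng(1)
# Synthetic "normal" traffic living near a 2-D subspace of 5-D features
latent = rng.normal(size=(500, 2))
basis = rng.normal(size=(2, 5))
normal = latent @ basis + 0.01 * rng.normal(size=(500, 5))

mu, W = fit(normal, k=2)
threshold = np.percentile(recon_error(normal, mu, W), 99)

attack = rng.normal(size=(1, 5)) * 3   # off-subspace sample to flag
is_anomaly = recon_error(attack, mu, W)[0] > threshold
```

Because the model never sees attack samples during training, class imbalance stops being a problem: the decision boundary is set entirely by how normal traffic reconstructs.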

  • Optimal attack detection using an enhanced machine learning algorithm   Order a copy of this article
    by Reddy Saisindhutheja, Gopal K. Shyam, Shanthi Makka 
    Abstract: As computer network and internet technologies advance, the importance of network security is widely acknowledged, yet it remains a substantial challenge in cyberspace. The Software-as-a-Service (SaaS) layer describes cloud applications to which users connect using web protocols such as Hypertext Transfer Protocol over Transport Layer Security. Worms, spam, Denial-of-Service (DoS) attacks and botnets occur frequently in these networks. In this light, this research introduces a new security platform for the SaaS framework comprising two major phases: (1) optimal feature selection and (2) classification. Initially, the optimal features are selected from the dataset, since each dataset includes additional features that add complexity. A novel algorithm named the Accelerator-updated Rider Optimisation Algorithm (AR-ROA), a modified form of ROA, and a Deep Belief Network (DBN) based attack detection system are proposed. The optimal features selected by AR-ROA are subjected to a DBN classification process that determines the presence of attacks. The proposed work is compared against existing models using a benchmark dataset and obtains improved results, outperforming traditional models in terms of accuracy (95.3%), specificity (98%), sensitivity (86%), precision (92%), negative predictive value (97%), F1-score (86%), false positive ratio (2%), false negative ratio (10%), false detection ratio (10%) and Matthews correlation coefficient (0.82).
    Keywords: software-as-a-service framework; security; ROA; optimisation; DBN; attack detection system.

Special Issue on: Cloud and Fog Computing for Corporate Entrepreneurship in the Digital Era

  • Enhanced speculative approach for big data processing using BM-LOA algorithm in cloud environment   Order a copy of this article
    by Hetal A. Joshiara, Chirag S. Thaker, Sanjay M. Shah, Darshan B. Choksi 
    Abstract: A massively parallel processing job can be delayed considerably if even one of its many tasks is allocated to an unreliable or overloaded machine. Hence, most parallel processing methodologies, notably MapReduce (MR), have adopted diverse strategies to overcome this so-called straggler problem: the scheme may speculatively launch extra copies of a task whose progress is unnaturally slow when idle resources are available. In the proposed strategy-centred process, dead nodes are identified, and the remaining time (RT) and backup time of each slow task are estimated; the slow task is then rerun with the aid of BM-LOA after this evaluation. The proposed approach is evaluated in both heterogeneous and homogeneous environments, and its performance is scrutinised experimentally using standard performance metrics. The results show that the proposed technique achieves superior performance compared with the other approaches.
    Keywords: modified exponentially weighted moving average; speculative execution strategy; Hadoop supreme rate performance; big data processing; rerun.

  • Importance of big data for analysing models in social networks and improving business   Order a copy of this article
    by Zhenbo Zang, Honglei Zhang, Hongjun Zhu 
    Abstract: The digital revolution is the accumulation of many digital advances, such as the transformation of network phenomena. To increase business productivity and performance, participatory websites that allow active user involvement and collective intelligence have become widely recognised as value-adding tools by organisations of all sizes. However, the business and management literature has lacked emphasis on promoting profitable business-to-business (B2B) activity. User knowledge is one way to consider the kinds of questions that big data can address. The use of social media data for analysing business networks is a promising field of research: technological advances have made the storage and analysis of large volumes of data commercially feasible. Big data encompasses many sources of structured, semi-structured and unstructured data in real time, and is a recent phenomenon in computing that involves technological and market study. The management and analysis of a vast network provide all companies with significant advantages and challenges. The volume of information flowing through social networks grows every day and, if processed correctly, provides a rich pool of evidence.
    Keywords: big data; business-to-business; data analytics; text mining; social networks.