Forthcoming and Online First Articles

International Journal of Grid and Utility Computing

International Journal of Grid and Utility Computing (IJGUC)

Forthcoming articles have been peer-reviewed and accepted for publication but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

Online First articles are published online here, before they appear in a journal issue. Online First articles are fully citeable, complete with a DOI. They can be cited, read, and downloaded. Online First articles are published as Open Access (OA) articles to make the latest research available as early as possible.

Articles marked with this Open Access icon are Online First articles. They are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licenses.

Register for our alerting service, which notifies you by email when new issues are published online.

We also offer feeds which provide timely updates of tables of contents, newly published articles and calls for papers.

International Journal of Grid and Utility Computing (47 papers in press)

Regular Issues

  • Research on modelling analysis and maximum power point tracking strategies for distributed photovoltaic power generation systems based on adaptive control technology   Order a copy of this article
    by Yan Geng, Jianwei Ji, Bo Hu, Yingjun Ju 
    Abstract: Distributed photovoltaic power generation technology has developed rapidly in recent years, but the cost of distributed photovoltaic generation remains much higher than that of traditional generation modes. Therefore, how to improve the effective use of photovoltaic cells has become a popular research direction. Based on an analysis of the characteristics of photovoltaic cells, this paper presents a mathematical model of photovoltaic cells and a maximum power point tracking algorithm combining hysteresis control with an adaptive variable-step perturbation-and-observation method. This algorithm balances the control precision and control speed of the perturbation-observation method and improves tracking results significantly. Finally, the feasibility of the algorithm and its tracking effects are simulated using Matlab/Simulink.
    Keywords: distributed photovoltaic; adaptive control technology; maximum power point tracking strategies.
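    The variable-step perturb-and-observe idea described in the abstract can be sketched in a few lines: the perturbation step grows with the magnitude of the power-voltage slope, and a hysteresis band suppresses oscillation near the maximum power point. This is an illustrative sketch only; the function name and the gains and thresholds (`k`, `step_max`, `step_min`, `hysteresis`) are assumptions, not the authors' implementation.

```python
def mppt_step(v, p, prev_v, prev_p,
              step_max=0.5, step_min=0.01, k=0.05, hysteresis=1e-3):
    """One iteration of a variable-step perturb-and-observe MPPT.

    Returns the next voltage perturbation (illustrative sketch)."""
    dp, dv = p - prev_p, v - prev_v
    if abs(dp) < hysteresis:          # inside the hysteresis band: hold
        return 0.0
    # step size scales with |dP/dV|, clamped to [step_min, step_max]
    slope = dp / dv if dv != 0 else 0.0
    step = min(step_max, max(step_min, k * abs(slope)))
    # perturb in the direction that increased power last time
    return step if (dp > 0) == (dv > 0) else -step


# Toy PV curve with its maximum power point at v = 17 (hypothetical numbers)
def pv(v):
    return 300.0 - (v - 17.0) ** 2

v_prev, v = 10.0, 10.5
for _ in range(300):
    delta = mppt_step(v, pv(v), v_prev, pv(v_prev))
    v_prev, v = v, v + delta
```

    On this toy curve the operating voltage climbs with large steps on the steep flank and settles near the maximum with small ones, which is the precision/speed trade-off the abstract refers to.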

  • SDSAM: a service-oriented approach for descriptive statistical analysis of multidimensional spatio-temporal big data   Order a copy of this article
    by Weilong Ding, Zhuofeng Zhao, Jie Zhou, Han Li 
    Abstract: With the expansion of the Internet of Things, spatio-temporal data has been widely generated and used. The rise of spatio-temporal big data has led to a flood of new applications with statistical analysis characteristics. Applications based on statistical analysis of these data must deal with the large volume, diversity and frequent changes of the data, as well as its querying, integration and visualisation, so developing such applications is a challenging and time-consuming task. In order to simplify the statistical analysis of spatio-temporal data, a service-oriented method is proposed in this paper. The method defines models of spatio-temporal data services and functional services, defines a process-based spatio-temporal big data statistical application that invokes these basic data services and functional services, and proposes an implementation of both service types in a Hadoop environment. The validity and applicability of the method are verified by a case study of highway big data analysis.
    Keywords: spatio-temporal data; RESTful; web service.

  • Research on integrated energy system planning method considering wind power uncertainty   Order a copy of this article
    by Yong Wang, Yongqiang Mu, Jingbo Liu, Yongji Tong, Hongbo Zhu, Mingfeng Chen, Peng Liang 
    Abstract: With the development of energy technology, the planning and operation of integrated energy systems coupling electricity, gas and heat has become an important research topic in the future energy field. In order to address the influence of wind power uncertainty on the unified planning of integrated energy systems, this paper constructs a wind energy uncertainty quantification model based on intuitionistic fuzzy sets. On this basis, an integrated energy system planning model with optimal economic cost and environmental cost is established, and the model is solved by the harmony search algorithm. Finally, the proposed method is validated by simulation examples. The planning method can improve the grid's capacity to accommodate wind power and reduce the CO2 emissions of the system, and it has guiding significance for the long-term planning of integrated energy systems.
    Keywords: wind power uncertainty; planning method; electricity-gas-heat energy.

  • A privacy-aware and fair self-exchanging self-trading scheme for IoT data based on smart contract   Order a copy of this article
    by Yuling Chen, Hongyan Yin, Yaocheng Zhang, Wei Ren, Yi Ren 
    Abstract: With the development of the era of big data, the demand for data sharing and usage is increasing, especially in the era of the Internet of Things, putting forward a keen demand for data exchanging and data trading. However, existing data exchanging and trading platforms are usually centralised, and users have to trust the platforms. This paper proposes a secure and fair exchanging and trading protocol based on blockchain and smart contracts that is self-governing, without relying on centralised trust. The protocol guarantees fairness to defend against trade cheating, and security for data confidentiality. It also guarantees efficiency by transferring data links instead of data between data owners and data buyers. The extensive analysis justifies that the proposed scheme can facilitate self-exchanging and self-trading of big data in a secure, fair and efficient manner.
    Keywords: big data; IoT; fair exchanging; blockchain; smart contract; oblivious protocol; fair trading.

  • Micro-PaaS fog: container based orchestration for IoT applications using SBC   Order a copy of this article
    by Walter D.O. Santo, Rubens De Souza Matos Júnior, Admilson De Ribamar Lima Ribeiro, Danilo Souza Silva, Reneilson Yves Carvalho Santos 
    Abstract: The Internet of Things (IoT) is an emerging technology paradigm in which ubiquitous sensors monitor physical infrastructures, environments, and people in real time to help in decision making and to improve the efficiency and reliability of systems, adding comfort and quality of life to society. In this sense, questions concerning limited computational resources, high latency and different QoS requirements related to IoT are moving cloud technologies in the direction of fog computing and the adoption of lightweight virtualised solutions, such as container-based technologies, to attend to the needs of different domains. The goal of this work, therefore, is to propose and implement a micro-PaaS architecture for fog computing, on a cluster of single-board computers (SBC), for orchestration of containerised applications applied to IoT that attends to QoS criteria, e.g. high availability, scalability, load balancing, and latency. From this proposed model, the micro-PaaS fog was implemented with container virtualisation and orchestration services on a cluster of Raspberry Pi devices to monitor water and energy consumption, at a total cost of ownership equivalent to 23% of a public platform as a service (PaaS).
    Keywords: fog computing; cluster; orchestration; containers; single board computing.

  • Anomaly detection against mimicry attacks based on time decay modelling   Order a copy of this article
    by Akinori Muramatsu, Masayoshi Aritsugi 
    Abstract: Because cyberattackers attempt to cheat anomaly detection systems, it is required to make an anomaly detection system robust against such attempts. We focus on mimicry attacks and propose a system to detect such attacks in this paper. Mimicry attacks make use of ordinary operations in order not to be detected. We take account of time decay in modelling operations to give lower priorities to preceding operations, thereby enabling us to detect mimicry attacks. We empirically evaluate our proposal with varying time decay rates to demonstrate that our proposal can detect mimicry attacks that could not be detected by a state-of-the-art anomaly detection approach.
    Keywords: anomaly detection; mimicry attacks; time decay modelling; stream processing.
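    The time-decay idea in the abstract can be illustrated with a small sketch: the frequency profile of observed operations is discounted exponentially with age, so a mimicry attack padded with formerly common operations scores lower as those operations fade from the profile. The class name, decay parameterisation and scoring rule below are hypothetical, not the authors' system.

```python
import math

class DecayProfile:
    """Operation-frequency profile with exponential time decay
    (illustrative sketch; the decay rate is a free parameter)."""

    def __init__(self, decay=0.1):
        self.decay = decay        # per-time-unit decay rate
        self.weights = {}         # operation -> decayed weight
        self.last_t = 0.0

    def _age(self, t):
        factor = math.exp(-self.decay * (t - self.last_t))
        for op in self.weights:
            self.weights[op] *= factor
        self.last_t = t

    def observe(self, op, t):
        self._age(t)
        self.weights[op] = self.weights.get(op, 0.0) + 1.0

    def score(self, op, t):
        """Relative decayed frequency in [0, 1]; low values suggest anomaly."""
        self._age(t)
        total = sum(self.weights.values())
        return self.weights.get(op, 0.0) / total if total else 0.0
```

    Because preceding operations carry lower weight, a sequence of ordinary-looking operations executed long ago no longer dominates the profile, which is what lets the detector flag mimicry behaviour.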

  • A cloud-based spatiotemporal data warehouse approach   Order a copy of this article
    by Georgia Garani, Nunziato Cassavia, Ilias Savvas 
    Abstract: The arrival of the big data era introduces new necessities for accommodating data access and analysis by organisations. The evolution of data is three-fold: increases in volume, variety, and complexity. The majority of data nowadays is generated in the cloud, and cloud data warehouses enhance the benefits of the cloud by facilitating the integration of data there. A data warehouse is developed in this paper which supports both spatial and temporal dimensions. The research focuses on proposing a general design for spatiobitemporal objects implemented by nested dimension tables using the starnest schema approach. Experimental results show that the parallel processing of such data in the cloud can answer OLAP queries efficiently, and that increasing the number of computational nodes significantly reduces query execution time. The feasibility, scalability, and utility of the proposed technique for querying spatiotemporal data are demonstrated.
    Keywords: cloud computing; big data; hive; business intelligence; data warehouses; cloud based data warehouses; spatiotemporal data; spatiotemporal objects; starnest schema; OLAP; online analytical processing.

  • Recommendation system based on space-time user similarity
    by Wei Luo, Zhihao Peng, Ansheng Deng 
    Abstract: With the advent of 5G, the way people get information and the means of information transmission have become more and more important. As the main platform of information transmission, social media not only brings convenience to people's lives, but also generates huge amounts of redundant information because of the speed at which information is updated. In order to meet the personalised needs of users and enable them to find interesting information in a large volume of data, recommendation systems have emerged. Recommendation systems, as an important tool to help users filter internet information, play an extremely important role in both academia and industry. The traditional recommendation system assumes that all users are independent. In this paper, in order to improve prediction accuracy, a recommendation system based on space-time user similarity is proposed. Experimental results on a Sina Weibo dataset show that, compared with the traditional collaborative filtering recommendation system based on user similarity, the proposed method performs better in precision, recall and F-measure.
    Keywords: time-based user similarity; space-based user similarity; recommendation system; user preference; collaborative filtering.
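    One way the temporal and spatial evidence could be blended into a single user-similarity score is a weighted combination, as sketched below. The abstract does not give the paper's exact formulation; the activity-hour histogram, the Jaccard measure over visited places and the weight `alpha` are assumptions for illustration.

```python
def jaccard(a, b):
    """Set overlap of two collections of visited places."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def time_similarity(hours_u, hours_v):
    """Cosine similarity of 24-bin activity-hour histograms."""
    dot = sum(x * y for x, y in zip(hours_u, hours_v))
    nu = sum(x * x for x in hours_u) ** 0.5
    nv = sum(y * y for y in hours_v) ** 0.5
    return dot / (nu * nv) if nu and nv else 0.0

def space_time_similarity(u, v, alpha=0.5):
    """Weighted blend of temporal and spatial similarity
    (illustrative; the paper's formulation may differ)."""
    return alpha * time_similarity(u["hours"], v["hours"]) + \
           (1 - alpha) * jaccard(u["places"], v["places"])
```

    A collaborative filtering recommender would then use this blended score instead of a rating-only similarity when selecting a user's neighbourhood.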

  • Design and analysis of novel hybrid load-balancing algorithm for cloud data centres   Order a copy of this article
    by Ajay Dubey, Vimal Mishra 
    Abstract: In the recent pandemic scenario there has been a paradigm shift from traditional computing to internet-based computing, and now is the time to store and compute data in the cloud environment. Cloud Service Providers (CSPs) establish and maintain a huge shared pool of computing resources that provides scalable and on-demand services around the clock without geographical restrictions, and cloud customers access the services and pay according to the resources they use. When millions of users across the globe connect to the cloud for their storage and computational needs, issues such as service delays can arise. This problem is associated with load balancing in cloud computing; hence, there is a need to develop effective load-balancing algorithms. The Novel Hybrid Load Balancing (NHLB) algorithm proposed in this paper manages the load of the virtual machines in the data centre. The paper focuses on problems such as performance optimisation, maximum throughput, minimisation of makespan, and efficient resource use in load balancing. The NHLB algorithm is more efficient than conventional load-balancing algorithms, with reduced completion time (makespan) and response time. It distributes tasks equally among the virtual machines on the basis of each machine's current state and the required task time. The paper compares the results of the proposed NHLB algorithm with the dynamic load-balancing and honeybee algorithms, and shows that the proposed algorithm outperforms both.
    Keywords: cloud computing; data centre; load balancing; virtual machine; makespan; performance optimisation.
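    The core idea of assigning each task from a machine's current state plus the task's required time can be sketched with a greedy longest-processing-time heuristic over a min-heap of virtual-machine finish times. This is an illustrative stand-in for that idea, not the NHLB algorithm itself.

```python
import heapq

def balance(tasks, n_vms):
    """Greedy load balancing: each task goes to the VM that will be free
    earliest, given its current load and the task's run time.

    A sketch of the state-plus-task-time idea, not the paper's exact NHLB."""
    heap = [(0.0, vm) for vm in range(n_vms)]   # (current finish time, vm id)
    heapq.heapify(heap)
    schedule = {vm: [] for vm in range(n_vms)}
    for t in sorted(tasks, reverse=True):       # longest task first
        finish, vm = heapq.heappop(heap)        # least-loaded VM
        schedule[vm].append(t)
        heapq.heappush(heap, (finish + t, vm))
    makespan = max(f for f, _ in heap)
    return schedule, makespan
```

    Sorting tasks longest-first before the greedy assignment is what keeps the makespan close to the even split that a balanced data centre aims for.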

  • Cloud infrastructure planning considering the impact of maintenance and self-healing routines over cost and dependability attributes   Order a copy of this article
    by Carlos Melo, Jean Araujo, Jamilson Dantas, Paulo Pereira, Felipe Oliveira, Paulo Maciel 
    Abstract: Cloud computing is the main trend in internet service provision. This paradigm, which emerged from distributed computing, gains more adherents every day. For those who provide or aim at providing a service or a private infrastructure, much has to be done: costs related to acquisition and implementation are common, and an alternative to reduce expenses is to outsource maintenance of resources. Outsourcing tends to be a better choice for those who run small infrastructures than paying employees monthly to keep the service life cycle. This paper evaluates infrastructure reliability and the impact of outsourced maintenance on the availability of private infrastructures. Our baseline environments focus on blockchain as a service; however, by modelling both service and maintenance routines, this study can be applied to most cloud services. The maintenance routines evaluated encompass a set of service level agreements and particularities of reactive, preventive, and self-healing methods. The goal is to point out which has the best cost-benefit for those with small infrastructures who still plan to provide services over the internet. Preventive and self-healing repair routines provided a better cost-benefit solution than traditional reactive maintenance routines, but this scenario may change according to the number of available resources the service provider has.
    Keywords: maintenance; reliability; availability; modelling; cloud computing; blockchain; container; services; SLA.

  • Edge computing and its boundaries to IoT and Industry 4.0: a systematic mapping study   Order a copy of this article
    by Matheus Silva, Vinícius Meyer, Cesar De Rose 
    Abstract: In the last decade, cloud computing transformed the IT industry, allowing companies to execute many services that require on-demand availability of computational resources with more flexible provisioning and cost models, including the processing of already growing data volumes. But in the past few years, other technologies such as internet of things and the digitised industry known as Industry 4.0 have emerged, increasing data generation even more. The large amounts of data produced by user-devices and manufacturing machinery have made both industry and academia search for new approaches to process all this data. Alternatives to the cloud centralised processing model and its inherent high latencies have been studied, and edge computing is being proposed as a solution to these problems. This study presents a preliminary mapping of the edge computing field, focusing on its boundaries to the internet of things and Industry 4.0. We began with 219 studies from different academic databases, and after the classification process, we mapped 90 of them in eight distinct edge computing sub-areas and nine categories based on their main contributions. We present an overview of the studies on the edge computing area, which evidences the main concentration sub-areas. Furthermore, this study intends to clarify the remaining research gaps and the main challenges faced by this field, considering the internet of things and Industry 4.0 demands.
    Keywords: edge computing; internet of things; Industry 4.0; systematic mapping.

  • Data collection in underwater wireless sensor networks: performance evaluation of FBR and epidemic routing protocols for node density and area shape   Order a copy of this article
    by Elis Kulla 
    Abstract: Data collection in Underwater Wireless Sensor Networks (UWSN) is not a trivial problem, because of unpredictable delays and unstable links between underwater devices. Moreover, when nodes are mobile, continuous connectivity is not guaranteed: node scarcity and movement patterns create different environments for data collection in underwater communication. In this paper, we investigate the impact of area shape and node density in UWSN by comparing the Focused Beam Routing (FBR) and Epidemic Routing (ER) protocols. Furthermore, we analyse the correlation between different performance metrics. From simulation results we found that when using FBR, delay and delivery probability slightly decrease (2.1%), but the overhead ratio decreases noticeably (46.9%). The correlation between performance metrics is stronger for the square area shape and is not noticeable for the deep area shape.
    Keywords: underwater wireless sensor networks; focused beam routing; delay tolerant network; area shape; node density; data collection.

  • Joint end-to-end recognition deep network and data augmentation for industrial mould number recognition   Order a copy of this article
    by RuiMing Li, ChaoJun Dong, JiaCong Chen, YiKui Zhai 
    Abstract: With the booming manufacturing industry, the significance of mould management is increasing. At present, manual management is gradually being eliminated owing to the large amount of labour it needs, while the effect of radio-frequency identification (RFID) systems is not ideal, limited by characteristics of the metal such as rust and erosion. Fortunately, the rise of convolutional neural networks (CNNs) offers a solution to mould management from the perspective of images: managing moulds by identifying the digital number on each mould. Yet there is no trace of a public database for mould recognition, and no specialised recognition method in this field. To address this problem, this paper first presents a novel dataset to support CNN training. The images in the database are collected in real scenes and finely manually labelled, so an effective recognition model can be trained that generalises to the actual scenario. Besides, we combine a mainstream text spotter with data augmentation specifically designed for the real world, and find that this has a considerable effect on mould recognition.
    Keywords: mould recognition database; text spotter; mould recognition; data augmentation.

  • An SMIM algorithm for reduction of energy consumption of virtual machines in a cluster   Order a copy of this article
    by Dilawaer Duolikun, Tomoya Enokido, Makoto Takizawa 
    Abstract: Applications can take advantage of virtual computation services independently of the heterogeneity and locations of servers by using virtual machines in clusters. Here, a virtual machine on an energy-efficient host server has to be selected to perform an application process. In this paper, we propose a new SMI (Simple Monotonically Increasing) estimation algorithm to estimate the energy consumption of a server performing application processes and the total execution time of the processes on the server. We also propose an SMIM (SMI Migration) algorithm that makes a virtual machine migrate from a host server to a guest server to reduce the total energy consumption of the servers, using the energy estimates of the SMI algorithm. In the evaluation, we show that the energy consumption of servers in a cluster is reduced more by the SMIM algorithm than by other algorithms.
    Keywords: server selection algorithm; migration of virtual machines; green computing systems; SMI algorithm; SMIM algorithm.

  • FIAC: fine-grained access control mechanism for cloud-based IoT framework   Order a copy of this article
    by Bhagwat Prasad Chaudhury, Kasturi Dhal, Srikant Patnaik, Ajit Kumar Nayak 
    Abstract: Cloud computing technology provides various computing resources on demand to users on a pay-per-use basis. Users consume the services without establishment and maintenance costs. Adoption of the technology is hindered, however, by confidentiality and privacy issues. Access control mechanisms are the tools that prevent unauthorised access to remotely stored data, and CipherText Policy Attribute-based Encryption (CPABE) is a widely used tool for letting authorised users access remotely stored encrypted data with fine-grained access control. In the proposed model, FIAC (Fine-grained Access Control mechanism for a cloud-based IoT framework), the access control mechanism is embedded in a cloud-based application that measures and reports on air quality in a city. The major contribution of this work is the design of three attribute-based algorithms: key generation, encryption and decryption. Only authorised users can view the report and take appropriate action. Carbon dioxide concentration, dust, temperature and relative humidity are the air-quality parameters considered. The computation time of the model is found to be encouraging, so it can be used in low-power devices. The experimental outcomes establish the usability of our model.
    Keywords: air pollution; carbon dioxide concentrations; dust density; internet of things; FIAC; access control.

  • University ranking approach with bibliometrics and augmented social perception data   Order a copy of this article
    by Kittayaporn Chantaranimi, Rattasit Sukhahuta, Juggapong Natwichai 
    Abstract: Typically, universities aim to achieve a high position in ranking systems for their reputation. However, self-evaluating rankings can be costly because the indicators come not only from bibliometrics but also from the results of over a thousand surveys. In this paper, we propose a novel approach to estimate university rankings based on traditional data, i.e., bibliometrics, and non-traditional data, i.e., Altmetric Attention Scores and Sustainable Development Goals indicators. Our approach estimates subject-area rankings in Arts & Humanities, Engineering & Technology, Life Sciences & Medicine, Natural Sciences, and Social Sciences & Management. Then, using Spearman rank-order correlation and the overlapping rate, our results are evaluated against the QS subject ranking. The results show that our approach, particularly for the top-10 ranking, estimates effectively and could assist stakeholders in estimating a university's position when the survey is not available.
    Keywords: university ranking; rank similarity; bibliometrics; augmented social perception data; sustainable development goals; Altmetrics.

  • Strongly correlated high utility item-sets recommendation in e-commerce applications using EGUI-tree over data stream   Order a copy of this article
    by P. Amaranatha Reddy, M. H. M. Krishna Prasad, S. Rao Chintalapudi 
    Abstract: The product recommendation feature helps customers select the right items in e-commerce applications. Based on the purchase histories of similar customers, it recommends items to customers, who may or may not choose from the recommended list. If customers like and purchase the recommended items, both buyers and sellers benefit: sellers see increased sales, and buyers save search time. To meet such requirements, a recommendation system must be designed so that the suggested items have High Utility (HU) and strong correlation: HU items yield high profit, and strongly correlated items have a higher probability of being selected. As both measurements play significant roles in the business process, both need to be taken care of. Here, to use up-to-date data, the stream of purchase transactions is mined with a sliding-window technique to extract such item-sets.
    Keywords: high utility item-set mining; recommendation system; utility; correlation; data stream; sliding window.
    DOI: 10.1504/IJGUC.2023.10057772
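    The notion of itemset utility over a sliding window can be illustrated with a toy brute-force miner over utility-annotated transactions; the EGUI-tree exists precisely to avoid this exhaustive enumeration, so the sketch below only shows what is being computed, not how the paper computes it.

```python
from itertools import combinations

def high_utility_itemsets(window, min_util):
    """Brute-force high-utility itemset mining over a sliding window of
    transactions, each given as {item: utility}.

    A toy sketch of the target computation, not the EGUI-tree algorithm."""
    items = sorted({i for t in window for i in t})
    result = {}
    for r in range(1, len(items) + 1):
        for combo in combinations(items, r):
            # utility of an itemset: summed utilities over the
            # transactions in the window that contain every item
            util = sum(sum(t[i] for i in combo) for t in window
                       if all(i in t for i in combo))
            if util >= min_util:
                result[combo] = util
    return result
```

    Sliding the window forward simply drops the oldest transaction and appends the newest before re-mining, which is what keeps the recommendations based on up-to-date purchases.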
  • Communication optimisation of smart agriculture wireless sensor network based on improved ant colony algorithm   Order a copy of this article
    by Zhihui Lin 
    Abstract: A Wireless Sensor Network (WSN) is a distributed sensor network that has become popular in recent years and has a wide range of applications. It is also very important for the optimisation and transformation of traditional agriculture. This article analyses the current problems in the field of agricultural information at home and abroad and briefly introduces the advantages of intelligent agricultural information. Based on a study of the ant colony optimisation algorithm and of wireless sensor network routing technology, and starting from the perspectives of improving network security and reducing network energy consumption, it improves the ant colony algorithm and applies it to smart agriculture. The study found that the current agricultural output value is only around 250,000 yuan, which makes it difficult to meet farmers' income expectations; however, through intelligent transformation of farms using wireless sensors with the improved ant colony algorithm, output and benefits can be improved.
    Keywords: improved ant colony algorithm; smart agriculture; wireless sensors; network communication optimisation.
    DOI: 10.1504/IJGUC.2023.10061389
  • Automatic remote sensing image mosaic technology of small surveying and mapping UAV based on edge computing   Order a copy of this article
    by Yong Liu 
    Abstract: Aiming at the problems of slow running speed and poor real-time performance of traditional UAV remote sensing image (RSI) algorithms, this paper proposes an improved algorithm with good real-time performance and fast data processing speed. By studying the applicable conditions and ranges of direct average fusion, weighted average fusion, Poisson fusion and random sample consensus (RANSAC), this paper computes image mosaic results and proposes an RSI mosaic fusion optimisation algorithm based on edge computing for small surveying and mapping UAVs. The experimental results show that the RANSAC stitching method adopted in this paper runs for a shorter time than the direct stitching algorithm, with operational efficiency increased by 70.8%. Compared with the weighted average fusion method, smoothness is better and operational efficiency is increased by 60%. Compared with the Poisson fusion algorithm, the RANSAC method shows less brightness difference in the mosaic, and efficiency is increased by 40%.
    Keywords: unmanned aerial vehicle remote sensing image; edge computing; Poisson fusion; automatic image mosaic.
    DOI: 10.1504/IJGUC.2023.10061738
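    The weighted-average fusion that the abstract compares against can be illustrated on a one-dimensional overlap region, where the blending weight ramps linearly from one image to the other. This is a toy sketch of the general technique, not the paper's implementation.

```python
def blend_overlap(row_a, row_b):
    """Weighted-average fusion across a 1-D overlap: the weight ramps
    linearly from image A (left edge) to image B (right edge), which
    hides the visible seam a hard cut would leave."""
    n = len(row_a)
    out = []
    for i in range(n):
        w = 1.0 - i / (n - 1) if n > 1 else 0.5
        out.append(w * row_a[i] + (1.0 - w) * row_b[i])
    return out
```

    The smooth ramp is why weighted-average fusion scores well on seam smoothness even though, as the abstract reports, it is slower than the RANSAC-based pipeline.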
  • A proposed model based on k-nearest neighbour classifier with feature selection techniques to control and forecast plant disease   Order a copy of this article
    by Inas Ismael Imran, Rawaa Hamza Ali, Shymaa Mohammed Jameel, Refed Jaleel 
    Abstract: Plant diseases have caused widespread destruction, prompting urgent efforts to find a cure. One of the main issues for today's agricultural managers and decision makers is implementing effective detection programmes that account for the current health of their crops. As a result, agricultural management has started to pay attention to how plants are doing, specifically to make sure each plant gets the right care at the right time. Since its purpose is to extract information from vast amounts of data, data mining has gained increasing attention. This study used a soybean dataset to create a classification model for forecasting plant status using the K-Nearest Neighbours (K-NN) algorithm. The soybean data contains an extensive selection of features, and the effectiveness of the classifier drops when the data contain noisy attributes. Feature subset selection addresses this issue: choosing the right attributes leads to better predictions, a high-quality model, and identification of the features that have the greatest impact on plant status.
    Keywords: soya bean data; K-NN; feature selection; classification; evaluating.
    DOI: 10.1504/IJGUC.2023.10061759
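    The combination of K-NN classification with a filter-style feature-selection step can be sketched as follows. The ranking criterion used here (spread of class-conditional means) is an assumption for illustration, not necessarily the selection technique the paper applies.

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Plain k-nearest-neighbour majority vote with squared
    Euclidean distance."""
    dists = sorted((sum((a - b) ** 2 for a, b in zip(row, x)), y)
                   for row, y in zip(train_X, train_y))
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

def select_features(X, y, keep):
    """Filter-style selection: rank features by the spread of their
    class-conditional means and keep the top `keep` of them
    (a simple stand-in for the paper's subset-selection step)."""
    classes = sorted(set(y))

    def spread(j):
        means = [sum(X[i][j] for i in range(len(X)) if y[i] == c) /
                 max(1, sum(1 for yy in y if yy == c)) for c in classes]
        return max(means) - min(means)

    ranked = sorted(range(len(X[0])), key=spread, reverse=True)[:keep]
    return sorted(ranked)
```

    Dropping a constant or noisy attribute before the distance computation is exactly the effect the abstract attributes to feature selection: the neighbours found become class-relevant rather than noise-driven.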
  • Group computing task assignment and association analysis based on big data technology   Order a copy of this article
    by Ting Chen 
    Abstract: With the rapid development of computer network technology, the era of big data has followed, and the size of data has skyrocketed. Big data brings convenience to our daily life, allowing us to access massive amounts of knowledge and information at any time. But we can also see that the content of big data is very complex, grows very fast, and takes various forms. These characteristics pose serious threats and challenges to traditional data processing, so big data processing technology must adapt to this environment as soon as possible. The technology is divided into two parts: first, humans and machines cooperating to complete group computing; second, data self-processing algorithms. This study analyses and discusses these two aspects. It mainly targets the problem that big data tasks rely heavily on cognitive reasoning, and uses swarm optimisation computation to solve this problem efficiently. The purpose of this article is to discuss group computing task assignment and association analysis based on big data technology.
    Keywords: big data technology; group computing; task assignment; association analysis; self-processing algorithm.
    DOI: 10.1504/IJGUC.2023.10061845
  • Computer model of display design based on Internet of Things prototype system   Order a copy of this article
    by Jiang Lu 
    Abstract: This article uses an Internet of Things prototype system to study a computer model for display design, combining the transmission principles of IoT technology with computer digital simulation. It proposes applying IoT technology on the prototype system together with computers, networks, 4G/5G and other internet technologies, coupled with traditional display design in art, trade and related fields, to conduct display design computer model research. Compared with traditional IoT devices, the combination of IoT technology, computers and networks achieves a higher data transmission speed.
    Keywords: Internet of Things technology; prototype system; computer model; automation control technology.
    DOI: 10.1504/IJGUC.2023.10062169
  • Optimisation method of IoT financial data transmission efficiency based on consensus algorithm of blockchain technology   Order a copy of this article
    by Xiaojie Yang 
    Abstract: People are demanding increasingly dependable and effective data transfer as the Internet of Things era of information development unfolds. Fast and secure financial data transmission has emerged as a pressing issue for the sector, particularly in Internet of Things finance. The research develops a novel Internet of Things financial data transmission model based on the blockchain's DPoS consensus algorithm that outperforms the current approach by incorporating the benefits of reputation and gateway nodes. The performance test results showed that the honest nodes of the research algorithm were stable at about 91% in the 6th round of voting, while 998 valid blocks were produced at the end of 10 rounds of voting. In practical application tests, the lowest transmission rates were 2286 bps and 536 bps for different financial data sets, and the highest packet loss rates were 0.12% and 1.24%, respectively. Taken together, these data illustrate that the research model is extremely secure and stable, with high data transmission efficiency and low packet loss. The results and conclusions can provide more practical, secure and efficient solutions for data transmission problems in related fields.
    Keywords: internet of things; consensus algorithm; data transfer; finance; blockchain.
    DOI: 10.1504/IJGUC.2023.10062463
  • Application evaluation of mechanical and electronic diagnosis technology based on edge computing in engineering inspection and maintenance   Order a copy of this article
    by Xu Zhao, Yetong Wang 
    Abstract: How to improve the efficiency and quality of engineering maintenance while reducing maintenance cost is the focus of modern engineering maintenance. In this paper, electronic technology based on edge computing is selected to optimise the application of diagnostic technology in engineering inspection and maintenance, and edge computing and the short-time Fourier transform are used to help electronic diagnostic technology improve the efficiency of signal recognition. The paper also compares traditional diagnosis technology with edge-computing-based mechatronics diagnosis technology. The results show that the mechatronics diagnosis technology based on edge computing is significantly better at cost reduction, outperforming the traditional diagnosis technology by 112.5%.
    Keywords: electronic diagnostic technology; edge computing; engineering overhaul; machinery and electronics; engineering inspection.
    DOI: 10.1504/IJGUC.2023.10062464
  • Application of artificial intelligence AR enhancement technology based on mobile edge computing in three-dimensional image generation   Order a copy of this article
    by Xiaofei Ren 
    Abstract: To solve the problems of slow data processing and low image-generation quality encountered by traditional algorithms in 3D image generation, this paper analyses the application of augmented reality (AR) technology based on mobile edge computing and artificial intelligence in 3D image generation, discusses a mobile terminal model that can automatically generate 3D images, and conducts comparative experiments. The experimental results show that the brightness of samples 1-4 improves after applying the artificial intelligence AR enhancement technology based on mobile edge computing, with 3D image brightness of 89.66%, 88.17%, 88.72% and 89.23%, respectively. Experiments show that applying AR enhancement technology based on mobile edge computing and artificial intelligence to 3D image generation can effectively improve data processing speed, enhance the brightness and contrast of 3D images, and ensure image-generation quality.
    Keywords: mobile edge computing; artificial intelligence; augmented reality; three-dimensional image.
    DOI: 10.1504/IJGUC.2023.10062465
  • Application of intelligent system based on deep reinforcement learning in electrical engineering automation control   Order a copy of this article
    by Zhihe Wu 
    Abstract: The application of intelligent control technology in the power electronics industry has promoted the development of power automation. It not only changes control and management modes but also greatly improves work efficiency. However, in a power automation system, usage efficiency must be fully considered according to the actual situation, gradually promoting the application of intelligent technology in power automation. This paper analyses and discusses the application of intelligent technology in power systems and provides a reference for future work. On this basis, a method applying deep reinforcement learning to the automatic control of power engineering is proposed. The paper introduces a learning algorithm based on artificial emotion augmentation to improve the operational performance of power grids. The relationship between artificial emotion and reinforcement learning in artificial psychology is discussed from three perspectives: behaviour value selection, Q-value matrix update and reward value function update. According to the experiments and calculations, in the ACE simulation results the Q-learning method, the Q(λ) method and the DQL method are reduced by 39.7%, 55.8% and 61.7%, respectively; in the f simulation results, the Q-learning algorithm, the Q(λ)-learning algorithm and the deep Q-learning method are 58.3%, 75% and 75% lower than PID, respectively. Simulation experiments showed that the algorithm outperforms the other three algorithms.
    Keywords: electrical engineering automation control; deep reinforcement learning; smart system; Q-learning algorithm; traditional power system.
    DOI: 10.1504/IJGUC.2023.10062640
  • MOSQ-charge: a smart mosquito repellent wireless charger   Order a copy of this article
    by Shashi Bhushan, Manoj Kumar, Vinod Raturi, Rahim Mutlu 
    Abstract: The Internet of Things (IoT) has the potential to connect everything in our daily lives, including the devices we use and those around us. However, one significant issue that needs to be addressed is the charging problem faced by mosquito repellent devices. In this paper, we focus on the wireless charging of smart mosquito repellents using a system called MOSQ-Charge. MOSQ-Charge is built on the IoT architecture and comprises five major sections. The first section is the wireless charger, which connects the device to the internet. The second section is the Smart IoT Gateway that establishes the connection, followed by cloud services for storing and retrieving information. The third section involves developing an algorithm for securing the connection, and the fourth section is the application used for controlling the device. The battery level of the device is monitored using sensors, and the device can be detected from data held on the server. The wireless charger is connected to the gateway, and the gateways are linked to the cloud server using an IoT connectivity protocol named MQTT.
    Keywords: internet of things; IoT protocols; message queue telemetry transport; constrained application protocol; wireless mosquito repellent charging.
    DOI: 10.1504/IJGUC.2023.10062641
  • A study on financial early warning for technology companies incorporating big data and random forest algorithms   Order a copy of this article
    by Xuemei Wang 
    Abstract: The research is based on the random forest algorithm, combined with the SMOTE algorithm, to construct financial risk identification and prediction for technology enterprises, to assess their financial risk and to determine whether financial fraud is likely. The results show that the research model achieves effective accuracy and precision: when tested on the dataset it correctly identified 17 risky enterprises, with no financially risky enterprise misidentified as normal. Of the 119 FST firms in the real sample, 123 normal firms were correctly identified, and overall the model had a high prediction accuracy. The ACC metric of the study model was 97.2%, its precision reached 96.8%, its recall reached 1 and its F1 metric reached 98.4%.
    Keywords: financial fraud; financial malpractice; machine learning; random forest algorithm; SMOTE algorithm.
    DOI: 10.1504/IJGUC.2024.10062665
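
    The paper's own code is not part of this listing; as an illustrative sketch of the SMOTE step the abstract relies on (oversampling the minority "risky firm" class before training a random forest), here is a minimal from-scratch interpolation-based SMOTE. The function name and toy data are hypothetical.

```python
import numpy as np

def smote(X_minority, n_new, k=3, rng=None):
    """Minimal SMOTE: for each synthetic sample, pick a random minority
    point, pick one of its k nearest minority neighbours, and interpolate
    at a random fraction along the line between them."""
    rng = np.random.default_rng(rng)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_minority))
        d = np.linalg.norm(X_minority - X_minority[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]  # exclude the point itself
        j = rng.choice(neighbours)
        t = rng.random()
        synthetic.append(X_minority[i] + t * (X_minority[j] - X_minority[i]))
    return np.array(synthetic)
```

    The synthetic rows would then be appended to the minority class before fitting the random forest, which is how SMOTE mitigates the class-imbalance bias the abstract addresses.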
  • Design of real-time digital image processing system based on high performance computing   Order a copy of this article
    by Hongxing Sun, Yingwei Zhang, Wei Teng 
    Abstract: Nowadays, digital image processing systems are developing towards faster data transmission and processing, smaller size, higher real-time performance, and more flexible and convenient programming, so improving their performance has become a pressing concern. Starting from actual project development needs, this article utilised high-performance computing technology combined with the real-time processing characteristics of images. With the TMS320C6455 as the core framework, FPGA (Field Programmable Gate Array) + DSP (Digital Signal Processing) technology was used to achieve high-performance computing and real-time image processing. The TMS320C6455 embedded with the FPGA achieved data caching and data sharing between nodes. On this basis, real-time preprocessing of digital images was carried out on the FPGA, and a high-speed serial interface with the FPGA was used to increase the transmission bandwidth of the system. In real-time image tracking tests, the high-performance real-time digital image processing system took an average of 0.00425 s per image frame, and 0.00235 s for feature extraction and matching.
    Keywords: real-time digital image processing system; high-performance computing; image features; digital image pre-processing.
    DOI: 10.1504/IJGUC.2024.10062729
  • Multi-source heterogeneous data storage methods for omnimedia data space   Order a copy of this article
    by Wenbo Zhuo 
    Abstract: This article introduces the design of a multi-source heterogeneous dataset fusion system and discusses the key technologies and requirements of the system. Its application fields are discussed in detail, and the system functions and performance are briefly described. Finally, a multi-source heterogeneous data storage and retrieval system for omnimedia space is designed, effectively improving the accuracy of cross-modal retrieval. Comparing traditional storage methods with multi-source heterogeneous storage methods, the results show that both storage modes can meet practical work needs. Storage technology based on multiple heterogeneous sources helps improve user experience and satisfaction and provides more complete services for enterprises.
    Keywords: omnimedia data space; multi-source isomerism; data storage; data fusion; retrieval speed.
    DOI: 10.1504/IJGUC.2023.10062784
  • Fast communication technology for GOOSE preemptive transmission in large background traffic   Order a copy of this article
    by Feng Liao, Weichao Ou, Jinrong Chen, Yueqiang Wang, Shuaibing Wang 
    Abstract: Whether the distribution network can achieve fast transmission under high traffic is a critical issue. To address the poor real-time performance of GOOSE transmission in distribution networks, the study proposes a GOOSE-over-TCP message transmission scheme and identifies factors that affect real-time transmission performance, such as the number of nodes, the network load and the message size. Three improvement measures are specified: increasing the priority of GOOSE messages, disabling TCP's Nagle algorithm and maintaining the TCP communication link. A channel model for the data link layer and an optimisation method for avoiding interference are also specified. The maximum transmission delay is about 1.3 ms, the variance of the delay distribution is small, and processing delay accounts for most of the total.
    Keywords: distributed control; GOOSE; TCP; distribution networks; transmission schemes.
    DOI: 10.1504/IJGUC.2023.10063133
  • WHBO: War honey badger optimisation enabled load balancing in IoT-cloud-fog computing   Order a copy of this article
    by M.N. Babitha, M. Siddappa 
    Abstract: A load balancing scheme attains minimal processing time and minimal response time. The main concept of load balancing is that tasks are allocated and reallocated among available resources. In this work, an IoT-cloud-fog environment is simulated and tasks are initially allocated to VMs in round-robin fashion for each region. The workload of a VM is computed from memory, bandwidth, CPU and migration time; if the computed workload exceeds a threshold value, load balancing is conducted. Load balancing allocates tasks from users to VMs in a region on the basis of resource constraints, optimally utilising the proposed WHBO, and then assigns tasks to other underloaded VMs. The objectives considered are energy consumption, predicted resources, execution time, cost, trust, resource utilisation and bandwidth. Here, resources are predicted employing a DQN. Moreover, the devised WHBO is newly introduced by combining WSO and HBA.
    Keywords: virtual machine; internet of things; deep Q-network; war strategy optimisation; honey badger algorithm.
    DOI: 10.1504/IJGUC.2024.10063195
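
    The abstract describes a threshold-triggered rebalancing step: compute each VM's workload from memory, bandwidth, CPU and migration time, then migrate tasks off VMs whose load exceeds a threshold. The sketch below illustrates only that control logic; the weights, threshold and function names are hypothetical, and the paper's actual optimiser (WHBO) is not reproduced.

```python
def vm_workload(cpu, memory, bandwidth, migration_time,
                weights=(0.4, 0.3, 0.2, 0.1)):
    """Weighted combination of normalised resource usages (each in [0, 1]).
    The weights are illustrative, not taken from the paper."""
    w_cpu, w_mem, w_bw, w_mig = weights
    return w_cpu * cpu + w_mem * memory + w_bw * bandwidth + w_mig * migration_time

def rebalance(vms, threshold=0.8):
    """Return (overloaded, underloaded) VM ids; tasks on the first list
    would be reassigned to VMs on the second by the optimiser."""
    loads = {vid: vm_workload(**metrics) for vid, metrics in vms.items()}
    over = [v for v, load in loads.items() if load > threshold]
    under = [v for v, load in loads.items() if load <= threshold]
    return over, under
```

    In the paper this classification step would feed the WHBO search, which picks the concrete task-to-VM assignment subject to the listed objectives.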
  • Moving object detection for surveillance video frames using two stage multi-scale residual convolution neural networks   Order a copy of this article
    by Anshul Khairwa, Arunkumar Thangavelu 
    Abstract: Video data is becoming a more crucial source for identifying various activities in real-time scenarios, and detecting moving objects in surveillance video is a tedious task with traditional techniques. In this paper a two-stage moving object detection methodology is proposed using Multi-Scale Residual Block based Convolution Neural Networks (MSRB-CNN). The first stage uses a sequence of convolution layers within the CNN to extract feature maps of the regions of interest in each frame. The second stage refines the feature maps with a couple of MSRB layers to segment the foreground objects, and de-convolution layers are used to increase the resolution of the output feature maps. The proposed method achieved 98.96% precision in detecting moving objects.
    Keywords: video analytics; CNN; residual networks; surveillance videos; multi scale CNN.
    DOI: 10.1504/IJGUC.2023.10063329
  • Hierarchical access control of supply chain data based on blockchain technology   Order a copy of this article
    by Qiao Li 
    Abstract: In order to strengthen data sharing between enterprises and departments in the supply chain and improve data access transparency and privacy protection, a hierarchical access control mechanism for supply chain data based on blockchain is proposed. A multi-chain architecture for the supply chain scenario is designed to realise isolated storage of supply chain data and access control information. At the same time, a hierarchical access control model based on hierarchical attributes and blockchain is proposed, and the implementation and deployment of the smart contract are presented. The experimental results show that the throughput of this mechanism remains above 90 TPS and the average policy decision time is 26 ms. The mechanism performs stably even with a large number of policies and is feasible in real supply chain data access and sharing scenarios.
    Keywords: blockchain; supply chain; data classification; access restriction; strategy.
    DOI: 10.1504/IJGUC.2024.10063352
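
    The paper's smart contract is not shown in this listing; purely as an illustration of the kind of hierarchical-attribute policy decision such a contract evaluates, the following sketch grants access when a user's clearance level dominates the resource's level and the user belongs to the resource's chain segment. The level names, attribute keys and function are all hypothetical.

```python
RANKS = {"public": 0, "internal": 1, "confidential": 2}  # hypothetical hierarchy

def can_access(user_attrs, resource):
    """Hierarchical attribute check: clearance must be at least the
    resource's level AND the user must belong to the resource's segment.
    This mirrors the shape of an on-chain policy decision, not the
    paper's actual contract."""
    level_ok = RANKS[user_attrs["clearance"]] >= RANKS[resource["level"]]
    segment_ok = resource["segment"] in user_attrs["segments"]
    return level_ok and segment_ok
```

    The reported 26 ms average policy decision time would correspond to evaluating a check of this shape inside the deployed contract.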
  • Patent personalised recommendation method based on fusing co-occurrence network and point mutual information   Order a copy of this article
    by Na Deng, Chang Liu 
    Abstract: To promote patent transformation and corporate development, we propose a recommendation method based on a co-occurrence network and the pointwise mutual information (PMI) coefficient. Firstly, a word co-occurrence network is constructed from patent abstract texts to capture the co-occurrence relationships of nodes in the network; then the network is weighted using pointwise mutual information to measure the degree of connection between nodes from a data perspective; finally, patent personalised recommendation is carried out based on network modularisation to provide users with more accurate recommendation results. Experiments with data from the communication industry validate the effectiveness of the proposed approach in patent recommendation. It offers new insights for improving patent conversion rates and promoting the application of technological achievements.
    Keywords: patent personalised recommendation; co-occurrence network; point mutual information coefficient; text clustering.
    DOI: 10.1504/IJGUC.2024.10063591
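
    As a rough illustration of the first two steps the abstract describes (building a word co-occurrence network from abstract texts, then weighting edges by pointwise mutual information), here is a minimal sketch. Probabilities are estimated at document level here as an assumption; the paper may use a different co-occurrence window, and the function name is hypothetical.

```python
import math
from collections import Counter
from itertools import combinations

def pmi_network(documents):
    """Build a word co-occurrence network from tokenised documents and
    weight each edge by pointwise mutual information:
        PMI(a, b) = log( p(a, b) / (p(a) * p(b)) )
    with probabilities estimated over documents."""
    n = len(documents)
    word_df = Counter()   # documents containing each word
    pair_df = Counter()   # documents containing each word pair
    for doc in documents:
        words = set(doc)
        word_df.update(words)
        pair_df.update(frozenset(p) for p in combinations(sorted(words), 2))
    edges = {}
    for pair, c in pair_df.items():
        a, b = tuple(pair)
        edges[pair] = math.log((c / n) / ((word_df[a] / n) * (word_df[b] / n)))
    return edges
```

    Word pairs that co-occur more often than their individual frequencies predict get higher PMI weights, which is exactly the "degree of connection measured from a data perspective" the abstract refers to; the weighted network would then be partitioned into modules for recommendation.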
  • A model for simulating residential location choices of working women considering compact city planning   Order a copy of this article
    by Kaoru Fujioka 
    Abstract: In Japan, the aging of the population and declining birthrate have led to a shortage of workers, making it necessary to consider more diverse work styles and create new urban structures that accommodate a variety of lifestyles. In this study, we developed a model to simulate the residential location choices of working women in the context of compact city planning. Our results showed that land costs can impact the compactization of cities and that housing expenses and preferences for location are important factors in residential location decisions. We also found that differences in overall residential trends were based on housing expenses and preferences for location, rather than household type. Our model highlights the importance of addressing land costs and household preferences in compact city planning for working women and other diverse households.
    Keywords: residential location choices; compact city planning; multi-agent simulation.
    DOI: 10.1504/IJGUC.2023.10063738
  • Construction of game 3D modelling simulation training platform based on mobile edge computing and VR technology   Order a copy of this article
    by Guangning Xia 
    Abstract: Purpose: The purpose is to solve the problems of difficulty in learning and low user satisfaction in the design process of the game 3D training platform. Methods: This paper used mobile edge computing and VR technology to study the construction of the game 3D modeling simulation training platform. By using mobile edge computing, users can experience faster games by improving the running speed of the 3D game training platform. Results: The experiment proved that the game training platform built by mobile edge computing and VR technology can improve the user’s game level. User A scored an average of 8.02 points, 5.22 points, and 6.89 points higher than User B for 10 games of Game Q, Game W, and Game P. At the same time, it can also improve the satisfaction rate of platform users. Conclusion: Using mobile edge computing and VR to build a game training platform can improve users’ game level and users’ satisfaction with the platform.
    Keywords: game training platform; virtual reality technology; mobile edge computing; genetic algorithm.
    DOI: 10.1504/IJGUC.2024.10064243
  • Information system operational efficiency prediction algorithm based on deep learning   Order a copy of this article
    by Dayong Chang, Xiaofeng Gao, Yongqiang Guo, Du Wang 
    Abstract: Purpose: With the rapid development of information technology, the business processing, production and sales of enterprises are inseparable from information systems. The enterprise information system determines the competitiveness of the enterprise, and operational efficiency reflects its development status. However, traditional manual prediction methods struggle to accurately predict and analyse operational efficiency. Method: This article applied deep learning technology to predict the operational efficiency of enterprise information systems, fully training on the data generated by those systems. It used a Back Propagation Neural Network (BPNN) and a Deep Belief Network (DBN) to analyse total assets, operating expenses, investment expenses, operating income, operating profit and other enterprise data in order to predict operational efficiency. Result: The article trained on data from five retail listed companies in China's A-share market; the results showed that the average prediction accuracy of operating efficiency using the BPNN algorithm was 97.72%, while that of the DBN algorithm was 98.88%. The DBN algorithm shows good computational efficiency and predictive performance in enterprise information system data analysis.
    Keywords: operating efficiency; enterprise information system; deep learning; back propagation neural network; deep belief network.
    DOI: 10.1504/IJGUC.2024.10064309
  • A new full-duplex wireless MAC protocol for inducing parallel transmissions between neighbours   Order a copy of this article
    by Hikari Hashimoto, Tetsuya Shigeyasu 
    Abstract: With the development of signal processing technologies, full-duplex wireless transmission has attracted attention as a promising method to increase network throughput. In this study, we discuss a new Medium Access Control (MAC) protocol designed to improve throughput by inducing parallel transmissions by neighbors of an initial full-duplex transmitter (initiator). In our proposed approach, an initiator informs candidate terminals of an opportunity for parallel transmissions to avoid interference between transmissions. We also present the results of computer simulations to demonstrate that our proposed method may be expected to improve throughput performance effectively even under network conditions that also involve traditional terminals performing half-duplex wireless communication.
    Keywords: full-duplex wireless communication; media access control protocol; relay-based transmission networks.
    DOI: 10.1504/IJGUC.2023.10064381

Special Issue on: CONIITI 2019 Intelligent Software and Technological Convergence

  • Computational intelligence system applied to plastic microparts manufacturing process   Order a copy of this article
    by Andrés Felipe Rojas Rojas, Miryam Liliana Chaves Acero, Antonio Vizan Idoipe 
    Abstract: In the search for knowledge and technological development, there has been an increase in new analysis and processing techniques closer to human reasoning. With the growth of computational systems, hardware production needs have also increased. Parts with millimetric to micrometric features are required for optimal system performance, so demand for injection moulding is also increasing. The injection moulding process is a complex manufacturing process because its mathematical modelling is not yet established; therefore, computational intelligence can address the selection of correct injection variable values. This article presents the development of a computational intelligence system integrating fuzzy logic and neural network techniques with a CAE modelling system to support injection machine operators in selecting optimal machine process parameters to produce good-quality microparts using fewer processes. Tests carried out with this computational intelligence system have shown a 30% improvement in the efficiency of the injection process cycles.
    Keywords: computational intelligence; neural networks; fuzzy logic; micro-parts; plastic parts; computer vision; expert systems; injection processes; CAD; computer-aided design systems; CAE; computer-aided engineering.

Special Issue on: ICIMMI 2019 Emerging Trends in Multimedia Processing and Analytics

  • An optimal channel state information feedback design for improving the spectral efficiency of device-to-device communication   Order a copy of this article
    by Prabakar Dakshinamoorthy, Saminadan Vaitilingam 
    Abstract: This article introduces a regularised zero-forcing (RZF) based channel state information (CSI) feedback design for improving the spectral efficiency of device-to-device (D2D) communication. This proposed method exploits conventional feedback design along with the optimised CSI in regulating the communication flows in the communicating environment. The codebook-dependent precoder design improves the rate of feedback by streamlining time/frequency dependent scheduling. The incoming communication traffic is scheduled across the available channels by pre-estimating their adaptability and capacity across the underlying network. This helps to exchange partial channel information between the communicating devices without the help of base station services. These features reduce the transmission error rates to achieve better sum rate irrespective of the distance and transmit power of the devices.
    Keywords: CSI; D2D; feedback design; precoding; zero-forcing.
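
    The regularised zero-forcing precoder at the heart of this design has a standard closed form; the sketch below shows it for a channel matrix H (users × antennas). This is the textbook RZF formula, not code from the paper, and the regularisation value in the usage is an arbitrary illustration.

```python
import numpy as np

def rzf_precoder(H, alpha):
    """Regularised zero-forcing precoder for channel matrix H (K users
    x N antennas):
        W = H^H (H H^H + alpha * I)^{-1}
    alpha > 0 trades interference suppression against noise amplification;
    alpha -> 0 recovers plain zero-forcing (H W -> I)."""
    K = H.shape[0]
    return H.conj().T @ np.linalg.inv(H @ H.conj().T + alpha * np.eye(K))
```

    With a vanishing regulariser and a well-conditioned channel, the effective channel H @ W approaches the identity, i.e. inter-user interference is cancelled, which is the property the feedback design exploits when scheduling flows.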

Special Issue on: AMLDA 2022 Applied Machine Learning and Data Analytics Applications, Challenges, and Future Directions

  • Fuzzy forests for feature selection in high-dimensional survey data: an application to the 2020 US Presidential Election   Order a copy of this article
    by Sreemanti Dey, R. Michael Alvarez 
    Abstract: An increasingly common methodological issue in the field of social science is high-dimensional and highly correlated datasets that are unamenable to the traditional deductive framework of study. Analysis of candidate choice in the 2020 Presidential Election is one area in which this issue presents itself: in order to test the many theories explaining the outcome of the election, it is necessary to use data such as the 2020 Cooperative Election Study Common Content, with hundreds of highly correlated features. We present the fuzzy forests algorithm, a variant of the popular random forests ensemble method, as an efficient way to reduce the feature space in such cases with minimal bias, while also maintaining predictive performance on par with common algorithms such as random forests and logit. Using fuzzy forests, we isolate the top correlates of candidate choice and find that partisan polarisation was the strongest factor driving the 2020 Presidential Election.
    Keywords: fuzzy forests; machine learning; ensemble methods; dimensionality reduction; American elections; candidate choice; correlation; partisanship; issue voting; Trump; Biden.

  • An efficient intrusion detection system using unsupervised learning AutoEncoder   Order a copy of this article
    by N.D. Patel, B.M. Mehtre, Rajeev Wankar 
    Abstract: As attacks on the network environment are rapidly becoming more sophisticated and intelligent, the limitations of existing signature-based intrusion detection systems are becoming more evident. For new attacks such as Advanced Persistent Threats (APTs), signature patterns generalise poorly. Research on intrusion detection systems based on machine learning is being actively conducted to solve this problem. However, far fewer attack samples than normal samples are collected in real network environments, causing a class imbalance problem: when a supervised learning-based anomaly detection model is trained on such data, the results are biased towards normal samples. In this paper, an AutoEncoder (AE) is used to perform single-class anomaly detection to solve this imbalance problem. The experimental evaluation was conducted using the CIC-IDS2017 dataset, and the performance of the proposed method was compared with supervised models.
    Keywords: intrusion detection system; advanced persistent threat; CICIDS2017; AutoEncoder; machine learning; data analytics.
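
    The single-class idea the abstract describes — train only on normal traffic, then flag samples whose reconstruction error is large — can be illustrated compactly with a *linear* autoencoder (equivalent to PCA). The paper presumably uses a deep AE on CIC-IDS2017; the linear variant, toy data and threshold below are stand-in assumptions that show only the detection principle.

```python
import numpy as np

def fit_linear_ae(X_normal, n_components=2):
    """Fit a linear autoencoder on normal samples only: encode = project
    onto the top principal directions, decode = project back."""
    mu = X_normal.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_normal - mu, full_matrices=False)
    V = Vt[:n_components].T          # shared encoder/decoder weights
    return mu, V

def reconstruction_error(X, mu, V):
    Xc = X - mu
    X_hat = Xc @ V @ V.T             # encode then decode
    return np.linalg.norm(Xc - X_hat, axis=1)

def detect(X, mu, V, threshold):
    """Single-class anomaly detection: anomalous when the reconstruction
    error exceeds the threshold."""
    return reconstruction_error(X, mu, V) > threshold
```

    Because the model only ever sees normal traffic, attack samples fall off the learned manifold and reconstruct badly, which is why this formulation sidesteps the class-imbalance bias of supervised training.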

  • Optimal attack detection using an enhanced machine learning algorithm   Order a copy of this article
    by Reddy Saisindhutheja, Gopal K. Shyam, Shanthi Makka 
    Abstract: As computer network and internet technologies advance, the importance of network security is widely acknowledged, yet it remains a substantial challenge within cyberspace. The Software-as-a-Service (SaaS) layer describes cloud applications where users connect using web protocols such as hypertext transfer protocol over Transport Layer Security. Worms, SPAM, Denial-of-Service (DoS) attacks and botnets occur frequently in these networks. In this light, this research introduces a new security platform for the SaaS framework comprising two major phases: (1) optimal feature selection and (2) classification. Initially, the optimal features are selected from the dataset, since each dataset includes additional features that add complexity. A novel algorithm named Accelerator-updated Rider Optimisation Algorithm (AR-ROA), a modified form of ROA, and a Deep Belief Network (DBN) based attack detection system are proposed. The optimal features selected by AR-ROA are subjected to DBN classification, which determines the presence of attacks. The proposed work is compared against existing models on a benchmark dataset and obtains improved results, outperforming traditional models in accuracy (95.3%), specificity (98%), sensitivity (86%), precision (92%), negative predictive value (97%), F1-score (86%), false positive ratio (2%), false negative ratio (10%), false detection ratio (10%) and Matthews correlation coefficient (0.82).
    Keywords: software-as-a-service framework; security; ROA; optimisation; DBN; attack detection system.
    DOI: 10.1504/IJGUC.2022.10064192

Special Issue on: Cloud and Fog Computing for Corporate Entrepreneurship in the Digital Era

  • Enhanced speculative approach for big data processing using BM-LOA algorithm in cloud environment   Order a copy of this article
    by Hetal A. Joshiara, Chirag S. Thaker, Sanjay M. Shah, Darshan B. Choksi 
    Abstract: If one of several tasks is allocated to an undependable or overloaded machine, a massively parallel processing job can be delayed considerably. Hence, most parallel processing methodologies, notably MapReduce (MR), have adopted diverse strategies to overcome this issue, known as the straggler problem: the scheme may speculatively launch extra copies of a task whose progress is unnaturally slow whenever an idle resource is available. In the proposed strategy-centred process, the dead node is detected, and the RT and backup time of the slow task are assessed. Following this evaluation, the slow task is rerun with the aid of the BM-LOA. The proposed approach is evaluated in both heterogeneous and homogeneous environments, and its performance is scrutinised experimentally against standard performance metrics. When weighed against other approaches, the proposed technique achieves superior performance.
    Keywords: modified exponentially weighted moving average; speculative execution strategy; Hadoop supreme rate performance; big data processing; rerun.
    DOI: 10.1504/IJGUC.2022.10063581
  • Importance of big data for analysing models in social networks and improving business   Order a copy of this article
    by Zhenbo Zang, Honglei Zhang, Hongjun Zhu 
    Abstract: The digital revolution is the accumulation of many digital advances, such as the transformation of network phenomena. To increase business productivity and performance, participatory websites that allow active user involvement and collective intelligence have become widely recognised as a value-added tool by organisations of all sizes. However, the business and management literature has lacked an emphasis on promoting profitable business-to-business (B2B) activity. User knowledge is one way to consider the kinds of questions that big data can address. The use of social media data for analysing business networks is a promising field of research, and technological advances have made the storage and analysis of large volumes of data commercially feasible. Big data represents many sources of structured, semi-structured, and unstructured data in real time; it is a recent phenomenon in the computing field, spanning both technological and market study. The management and analysis of a vast network present all companies with significant advantages and challenges. The volume of information circulating on social networks grows every day and, if processed correctly, provides a rich pool of evidence.
    Keywords: big data; business-to-business; data analytics; text mining; social networks.

  • Study on the economic consequences of enterprise financial sharing model   Order a copy of this article
    by Yu Yang, Zecheng Yin 
    Abstract: Using enterprise-system ideas to examine the business process requirements of firms, the Financial Enterprise Model (FEM) is a demanding programme that integrates finance, accounting, and other critical business processes. Conventional finance faces difficulties owing to low economic inclusion, restricted access to capital, lack of data, poor R&D expenditure, underdeveloped distribution channels, and so on. This paper examines making, consuming, and redistributing goods through collaborative platform networks. These three cases highlight how ICTs (Information and Communication Technologies) can be exploited as a new source of company innovation. The sharing-economy model can help social companies solve their market problems, since social value can be embedded into their sharing-economy cycles. As part of the ICT-based sharing economy, new business models for social entrepreneurship can be developed by employing creative and proactive platforms. Unlike most public organisations, double-bottom-line organisations can create both social and economic advantages. These findings have implications for developing and propagating societal values.
    Keywords: finance; economy; enterprise; ICT; social advantage.