Forthcoming articles

International Journal of Cloud Computing (IJCC)

These articles have been peer-reviewed and accepted for publication but are pending final changes and are not yet published; they may not appear here in their final order of publication until they are assigned to issues. The content therefore conforms to our standards, but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

Register for our alerting service, which notifies you by email when new issues are published online.

Open Access: Articles marked with this Open Access icon are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licenses.
We also offer feeds which provide timely updates of tables of contents, newly published articles and calls for papers.

International Journal of Cloud Computing (70 papers in press)

Regular Issues

  • Optimization of Automatic Web Services Composition Using Genetic Algorithm   Order a copy of this article
    by Mirsaeid Hosseini Shirvani 
    Abstract: In recent years, with the expansion of organizations, service-oriented architecture has become known as an effective tool for creating applications. Hence, the need to use web services in organizations to reduce costs is felt more than ever. The purpose of web service composition is to determine a proper combination of web services to satisfy user requests that cannot be met by a single web service. In this paper, a genetic-algorithm-based method is proposed for composing cloud services that ensures multiple clouds work efficiently. The proposed method also provides an overview of the weaknesses of other available methods in terms of computational complexity in the automated selection of web services, and makes it possible to fulfill the demands of web service composition in a more optimal way. The simulation results show the superiority of the proposed method compared to the other methods analyzed in the paper.
    Keywords: Web Service; Web Services Composition; Service-Oriented Architecture; Quality of Service.
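As an illustrative aside, the kind of QoS-aware composition search described in the abstract above can be sketched as a small genetic algorithm. Everything below (the candidate-service table, the QoS values, the fitness weights and GA parameters) is hypothetical and not taken from the paper:

```python
import random

# Hypothetical QoS table: for each abstract task, candidate concrete
# services with (cost, response_time). Values are illustrative only.
CANDIDATES = [
    [(4.0, 1.2), (2.5, 2.0), (3.0, 1.5)],   # task 0
    [(1.0, 3.0), (2.0, 1.0), (1.5, 2.5)],   # task 1
    [(5.0, 0.8), (3.5, 1.4), (4.5, 1.0)],   # task 2
]

def fitness(chrom):
    """Lower is better: weighted sum of total cost and total response time."""
    cost = sum(CANDIDATES[t][g][0] for t, g in enumerate(chrom))
    rt = sum(CANDIDATES[t][g][1] for t, g in enumerate(chrom))
    return 0.5 * cost + 0.5 * rt

def compose(pop_size=20, generations=30, mutation_rate=0.1, seed=42):
    rng = random.Random(seed)
    n_tasks = len(CANDIDATES)
    # Seed the population with a naive baseline (first candidate everywhere).
    pop = [tuple(rng.randrange(len(CANDIDATES[t])) for t in range(n_tasks))
           for _ in range(pop_size - 1)] + [(0,) * n_tasks]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # truncation selection + elitism
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_tasks)   # one-point crossover
            child = list(a[:cut] + b[cut:])
            if rng.random() < mutation_rate:  # point mutation of one gene
                t = rng.randrange(n_tasks)
                child[t] = rng.randrange(len(CANDIDATES[t]))
            children.append(tuple(child))
        pop = elite + children
    return min(pop, key=fitness)
```

Because the baseline chromosome is in the initial population and the best half survives each generation, the returned composition is never worse than naively picking the first candidate for every task.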

  • A Secure and efficient multi cloud-based data storage and retrieval using hash-based verifiable secret sharing scheme   Order a copy of this article
    by Majid Farhadi, Hamideh Bypour, Seyyed Erfan Asadi 
    Abstract: As the number of smart devices rises, the need for fast and easy access to data, as well as for sharing more information, is increasingly felt. Cloud computing is a computational approach that shares configurable resources such as networks, servers, storage space, applications and services over the Internet, and allows the user to access services without expertise in, or control of, the technology infrastructure. The confidentiality, integrity and availability of data, and reducing the computational cost and communication overhead between the data owner (user) and cloud service providers (CSPs), are essential concerns in cloud computing. In this paper, we propose a new scheme to construct secure cloud data storage based on a verifiable secret sharing scheme with public verifiability to protect data integrity. In the new scheme, the validity of secret shares can be publicly verified without leaking their privacy in the verification phase. Moreover, the verification phase does not depend on any computational assumptions. Furthermore, the proposed scheme can not only detect cheating but also identify who the cheaters are. The proposed scheme is also more efficient than other secret-sharing-based cloud data storage schemes, since heavy and complex computation is not required.
    Keywords: Cloud computing; cloud data storage; verifiable secret sharing scheme; public verifiability; hash function.
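To make the idea concrete, here is a minimal sketch of (t, n) Shamir secret sharing with hash-based public verification of shares, in the spirit of the scheme described above; the field prime, the hash encoding and all parameters are our own illustrative choices, not the paper's construction:

```python
import hashlib
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is over GF(P)

def share(secret, t, n, seed=7):
    """Split `secret` into n Shamir shares with threshold t, plus
    hash commitments allowing public verification of each share."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    def f(x):
        y = 0
        for c in reversed(coeffs):      # Horner evaluation mod P
            y = (y * x + c) % P
        return y
    shares = [(i, f(i)) for i in range(1, n + 1)]
    commitments = {i: hashlib.sha256(f"{i}:{y}".encode()).hexdigest()
                   for i, y in shares}
    return shares, commitments

def verify(share_pair, commitments):
    """Anyone holding the public commitments can check a share."""
    i, y = share_pair
    return hashlib.sha256(f"{i}:{y}".encode()).hexdigest() == commitments.get(i)

def reconstruct(subset):
    """Lagrange interpolation at x = 0 over GF(P); needs >= t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(subset):
        num, den = 1, 1
        for j, (xj, _) in enumerate(subset):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any t valid shares reconstruct the secret, and a tampered share fails the public hash check; a production scheme would need a hiding commitment rather than a bare hash.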

  • Stream of Traffic Balance in Active Cloud Infrastructure Service Virtual Machines Using Ant Colony   Order a copy of this article
    by Ankita Taneja, Hari Singh, Suresh Chand Gupta 
    Abstract: Cloud load balancing is the manner of distributing computing resources and workloads over a cloud computing infrastructure. It allows an enterprise to manage workloads through appropriate resource allocation in the cloud. Various load balancing techniques in cloud computing are reviewed, and the work presented in this paper thoroughly analyzes and compares two well-known algorithms in MATLAB, the Ant Colony Optimization (ACO) algorithm and the Genetic Algorithm (GA). The objective is to produce an optimal solution for cost and execution time by balancing the workload. Experimental observations show that ACO-based load balancing incurs lower cost and lower execution time compared to the GA for a constant workload over a fixed number of cloud machines. However, the execution time follows a different trend when the workload increases and more machines are utilized to handle it: it rises sharply in ACO as compared to the GA.
    Keywords: ACO; ant colony optimization; GA; genetic algorithm; load balancing; cloud computing; pheromone matrix; pheromone table; IAAS; infrastructure as a service.
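The ACO load-balancing loop compared in the abstract above can be sketched roughly as follows; the task lengths, pheromone-update rule and parameter values are illustrative assumptions, not the authors' implementation:

```python
import random

def aco_balance(tasks, n_vms, n_ants=20, iters=30, rho=0.1, seed=1):
    """Assign task lengths to VMs guided by a pheromone matrix.
    Returns (best_assignment, best_makespan). Illustrative sketch only."""
    rng = random.Random(seed)
    tau = [[1.0] * n_vms for _ in tasks]        # pheromone: task x VM
    best, best_ms = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            loads = [0.0] * n_vms
            assign = []
            for t, length in enumerate(tasks):
                # desirability = pheromone * heuristic (prefer lighter VMs)
                weights = [tau[t][v] / (1.0 + loads[v]) for v in range(n_vms)]
                v = rng.choices(range(n_vms), weights=weights)[0]
                assign.append(v)
                loads[v] += length
            ms = max(loads)                     # makespan of this ant's tour
            if ms < best_ms:
                best, best_ms = assign, ms
        # evaporation, then deposit along the best tour found so far
        for t in range(len(tasks)):
            for v in range(n_vms):
                tau[t][v] *= (1.0 - rho)
            tau[t][best[t]] += 1.0 / best_ms
    return best, best_ms
```

The makespan of any assignment is bounded below by the longest single task and above by the total work, which gives a quick sanity check on the result.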

  • Memory constraint Parallelized resource allocation and optimal scheduling using Oppositional GWO for handling big data in cloud environment   Order a copy of this article
    by Chetana Tukkoji, Seetharam Keshav Rao 
    Abstract: In cloud computing, task scheduling is one of the most challenging problems, especially when deadlines and costs are considered. The key issue of task scheduling is to reach an optimal allocation of users' tasks, in order to optimize scheduling performance and reduce unreasonable task allocation in clouds. Besides, in terms of memory space and time complexity, processing a huge number of tasks with a sequential algorithm results in greater computational cost. Therefore, in this paper we propose an efficient memory-constrained parallelized resource allocation and optimal scheduling method applying Oppositional GWO to resolve the scheduling problem of big data in the cloud environment. The suggested approach applies the MapReduce framework to perform scheduling in parallel over distributed systems. The MapReduce framework consists of two main processes: task prioritization (with a Fuzzy C-means clustering method based on memory constraints) in the Map phase, and optimal scheduling (using the Oppositional Grey Wolf Optimization algorithm) in the Reduce phase. Here, scheduling is optimized to reduce makespan and cost and to raise system utilization.
    Keywords: Oppositional Grey Wolf Optimization algorithm; Fuzzy C-means Clustering; MapReduce; Task Prioritization; Virtual Machine Allocation; Apache Spark Distributed file System (SDFS).
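As a rough sketch of the opposition-based idea mentioned above: for a candidate solution x in [lb, ub], its opposite is lb + ub - x, and evaluating both doubles the chance of starting near an optimum. The cost function, parameters and wolf-update details below are hypothetical (the real method schedules tasks rather than optimising a toy function):

```python
import random

def ogwo(cost, dim, lb, ub, wolves=10, iters=50, seed=3):
    """Grey Wolf Optimizer with opposition-based initialisation:
    for each random wolf X, also evaluate its opposite lb + ub - X
    and keep the better half of the combined pool."""
    rng = random.Random(seed)
    pool = []
    for _ in range(wolves):
        x = [rng.uniform(lb, ub) for _ in range(dim)]
        pool.append(x)
        pool.append([lb + ub - xi for xi in x])   # opposite solution
    pool.sort(key=cost)
    pack = pool[:wolves]
    best = min(pack, key=cost)
    for it in range(iters):
        a = 2.0 * (1 - it / iters)                # a decreases 2 -> 0
        alpha, beta, delta = sorted(pack, key=cost)[:3]
        new_pack = []
        for x in pack:
            pos = []
            for d in range(dim):
                guided = []
                for leader in (alpha, beta, delta):
                    A = a * (2 * rng.random() - 1)
                    C = 2 * rng.random()
                    guided.append(leader[d] - A * abs(C * leader[d] - x[d]))
                pos.append(min(max(sum(guided) / 3, lb), ub))
            new_pack.append(pos)
        pack = new_pack
        cand = min(pack, key=cost)
        if cost(cand) < cost(best):
            best = cand
    return best
```

Because the best-so-far solution is tracked explicitly, running more iterations can only keep or improve the returned cost.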

  • An efficient load balancing scheduling strategy for cloud computing based on hybrid approach   Order a copy of this article
    by Mohammad Oqail Ahmad, Rafiqul Zaman Khan 
    Abstract: Cloud computing is a promising paradigm that is widely used in both academia and industry. Meeting the dynamic demand for resources by users is one of the prime goals of the task scheduling process in cloud computing. Task scheduling is an NP-hard problem which is responsible for allocating tasks to VMs and maximizing their utilization while minimizing the total task execution time. In this paper, the authors propose a load balancing scheduling strategy that hybridizes the resource-aware load balancing (RALB) method with the PSO technique, inspired by honeybee behaviour, named PSO-RALB. This strategy optimizes the results and performs scheduling based on a resource-aware load balancing scheme. The foraging behaviour of the honeybee optimization algorithm is utilized to balance load across VMs, and resource awareness is used to manage the resources. The computational results show that the proposed scheme minimizes the makespan time, total processing time, total processing cost and the degree of imbalance when compared with the standard PSO and PSO-based Load Balancing (PSO-LB) algorithms.
    Keywords: Cloud computing; Load balancing; Honey bee foraging; Particle Swarm Optimization; PSO-RALB Algorithm; Degree of imbalance.
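The honeybee-foraging intuition behind load balancing in the entry above (overloaded VMs release work that the least-loaded "scout" VM picks up) can be sketched independently of PSO; the migration policy and the degree-of-imbalance metric below are our simplification, not the PSO-RALB algorithm itself:

```python
def degree_of_imbalance(loads):
    """(Tmax - Tmin) / Tavg, a common imbalance metric."""
    avg = sum(loads) / len(loads)
    return (max(loads) - min(loads)) / avg if avg else 0.0

def honeybee_rebalance(vm_tasks, max_rounds=50):
    """Foraging-style rebalancing: the most-loaded VM gives up its
    smallest task to the least-loaded VM, while that strictly helps."""
    for _ in range(max_rounds):
        loads = [sum(ts) for ts in vm_tasks]
        src = loads.index(max(loads))
        dst = loads.index(min(loads))
        if src == dst or not vm_tasks[src]:
            break
        task = min(vm_tasks[src])
        # migrate only if it strictly reduces the maximum load
        if loads[dst] + task >= loads[src]:
            break
        vm_tasks[src].remove(task)
        vm_tasks[dst].append(task)
    return vm_tasks
```

Each accepted migration lowers the maximum load, so the degree of imbalance never increases and total work is preserved.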

  • End-to-End SLA Management in Federated Clouds   Order a copy of this article
    by Asma Al Falasi, Mohamed Adel Serhani, Younes Hamdouch 
    Abstract: Cloud services have always promised to be available, flexible, and speedy. However, in some circumstances (e.g. drastic changes in application requirements) a Cloud provider might fail to deliver such promises to their distinctly demanding customers. Cloud providers have a constrained geographical presence and are willing to invest in infrastructure only when it is profitable to them. Cloud federation is a concept that collectively combines segregated Cloud services to create an extended pool of resources through which Clouds can competently deliver their promised level of service. This paper is concerned with studying the governing aspects related to the federation of Clouds through collaborative networking. We propose a network of federated Clouds, CloudLend, that creates a platform for Cloud providers to collaborate, and for customers to expand their service selections. We also define and specify a service level agreement (SLA) management model in order to govern and administer the relationships established between different Cloud services in CloudLend. We define a multi-level SLA specification model to describe QoS terms, in addition to a game theory-based automated SLA negotiation model. We also define an adaptive agent-based SLA monitoring model. Formal verification proves that our proposed framework assures customers of maximum optimized guarantees for their QoS requirements, in addition to supporting Cloud providers in making informed resource utilization decisions. Additionally, simulation results demonstrate the effectiveness of our SLA management model.
    Keywords: Cloud Computing; Federated Clouds; SLA Management; Game Theory; QoS Requirements.

  • A Cloud Data Collection Platform for Canine Behavioral Prediction using Objective Sensor Data   Order a copy of this article
    by Zachary Cleghern, Marc Foster, Sean Mealin, Evan Williams, Timothy Holder, Alper Bozkurt, David Roberts 
    Abstract: Training successful guide dogs is time- and resource-intensive, requiring copious professional and volunteer labor. Even among the best programs, dogs are released with attrition rates commonly at 50%. Increasing success rates enables non-profits to meet growing demand for dogs and optimize resources. Selecting dogs for training is a crucial task; guide dog schools can benefit from both better selection accuracy and earlier prediction. We present a system aimed at improving analysis and selection of which dogs merit investment of resources using custom sensing hardware and a cloud-hosted data processing platform. To improve behavioral analysis at early stages, we present results using objective data acquired in puppy behavioral tests and the current status of an IoT-enabled "Smart Collar" system to gather data from puppies while being raised by volunteers prior to training. Our goal is to identify both puppies at risk and environmental influences on success as guide dogs.
    Keywords: Cloud Computing; Canine Behavior; Behavioral Prediction; Sensor Data; Internet-of-Things; Machine Learning; Wearable Computing; Guide Dogs.

  • Evaluation and Selection of Cloud deployment models using Fuzzy Combinative Distance-Based Assessment   Order a copy of this article
    by Nandini Kashyap, Rakesh Garg 
    Abstract: Cloud computing (CC) is an innovative technology that is completely transforming the way individuals collect, share and access their data files. Although CC technology provides many benefits, such as elasticity, resource pooling and on-demand services, various issues and challenges arise for its successful implementation. The evaluation and selection of cloud deployment models (CDMs) is one of the challenges most often faced by cloud practitioners. The present study addresses the CDM evaluation and selection problem in the education sector by modeling it as a multi-criteria decision-making (MCDM) problem. To solve this selection problem, a hybrid MCDM approach, namely Fuzzy Combinative Distance-based Assessment (Fuzzy-CODAS), is proposed. The proposed approach calculates a desirability index value for each of the alternatives, based on its Euclidean and Hamming distances from the negative-ideal solution. Finally, the alternatives are ranked by their desirability index values: the alternative with the maximum desirability index is placed at the top position, whereas the alternative with the minimum value is placed last.
    Keywords: Cloud Computing; Cloud deployment models; Multi-criteria decision making; Fuzzy- Combinative distance based assessment; Academic Organization.
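A stripped-down (non-fuzzy) sketch of the CODAS-style desirability index described above, combining Euclidean and Hamming (taxicab) distances from the negative-ideal solution; the alternative names, scores and the tau weight are invented for illustration, and the fuzzification step is omitted:

```python
def codas_rank(matrix, tau=0.02):
    """Rank alternatives by a desirability index built from Euclidean
    and Hamming distances to the negative-ideal solution (NIS).
    `matrix` maps alternative name -> weighted normalised scores
    (benefit criteria: higher is better)."""
    n = len(next(iter(matrix.values())))
    # negative-ideal solution: the worst score on each criterion
    nis = [min(row[j] for row in matrix.values()) for j in range(n)]
    index = {}
    for name, row in matrix.items():
        euclid = sum((row[j] - nis[j]) ** 2 for j in range(n)) ** 0.5
        hamming = sum(abs(row[j] - nis[j]) for j in range(n))
        index[name] = euclid + tau * hamming   # desirability index
    return sorted(index, key=index.get, reverse=True), index
```

The farther an alternative lies from the worst-case point on every criterion, the higher its desirability index and the better its rank.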

  • Docker-pi: Docker Container Deployment in Fog Computing Infrastructures   Order a copy of this article
    by Arif Ahmed, Guillaume Pierre 
    Abstract: The transition from virtual machine-based infrastructures to container-based ones brings the promise of swift and efficient software deployment in large-scale computing infrastructures. However, in fog computing environments, which are often made of very small computers such as Raspberry Pis, deploying even a very simple Docker container may take multiple minutes. We demonstrate that Docker makes inefficient usage of the available hardware resources, essentially using different hardware subsystems (network bandwidth, CPU, disk I/O) sequentially rather than simultaneously. We therefore propose three optimizations which, once combined, reduce container deployment times by a factor of up to 4. These optimizations also speed up deployment time by about 30% in datacenter-grade servers.
    Keywords: Docker; Container; Fog Computing; Deployment.

  • Performance evaluation & Reliability analysis of predictive hardware failure models in Cloud platform using ReliaCloud-NS   Order a copy of this article
    by Rohit Sharma 
    Abstract: Cloud computing systems are at present established as a promising trend, providing a platform for coordinating large numbers of heterogeneous tasks and aiming to deliver highly reliable cloud computing services. It is necessary to consider the reliability of cloud services and the timely prediction of failing hardware in cloud data centres, to ensure correct identification of the overall time required before resuming service after a failure. In this paper, the reliability of two recently introduced predictive hardware failure models is analysed: the first model is based on two open data sources, i.e. Self-Monitoring And Reporting Technology (SMART) and Windows performance counters, and the second model is based on FailureSim, a neural-network-based system for predicting hardware failures in data centres. The analysis is performed over our carefully designed test cloud simulations of 144 VMs and 236 VMs. The results are thoroughly compared and analysed with the help of ReliaCloud-NS, which allows researchers to design a CCS and compute its reliability.
    Keywords: Cloud Computing System (CCS); Virtual Machines (VM); Monte Carlo Simulation (MCS); Neural Networks; Annual Failure Rate (AFR); Self-Monitoring And Reporting Technology (SMART).

  • Efficient Multi-Level Cloud based Agriculture Storage Management System   Order a copy of this article
    by Kuldeep Sambrekar, Vijay Rajpurohit 
    Abstract: Attaining good agricultural productivity aids a country's Gross Domestic Product (GDP) growth. Guaranteeing food security across the globe poses huge challenges, due to global warming, the resulting unpredictable weather, and shrinking natural resources. As a result, data analytics (DA) and the Internet of Things (IoT) have been employed by various agencies, together with remote sensing forecasting and GIS technology, to build efficient agriculture management systems. The cloud computing platform has been adopted for storing and accessing these data remotely. However, it incurs cost overhead for storing and accessing large data. Multi-cloud platforms have been adopted; however, these models are not efficient, as they incur latency and do not provide fault tolerance guarantees. To overcome these research challenges, this work presents an Efficient Multi-Level Cloud-based Agriculture Storage Management System (EMLC-ASMS). The outcome shows EMLC-ASMS attains significant performance gains over the existing model in terms of computation cost and latency.
    Keywords: cloud based agricultural storage management; multi-level cloud storage; cloud storage optimization; multi-cloud storage; efficient hierarchical cloud based storage mechanism.

  • Towards an Efficient and Secure Computation over Outsourced Encrypted Data using Distributed Hash Table   Order a copy of this article
    by Raziqa Masood, Nitin Pandey, Q.P. Rana 
    Abstract: On-demand access to outsourced data from anywhere has diverted data owners' minds to storing their data on cloud servers instead of standalone devices. Security, privacy, and availability of data are still the major concerns that need to be addressed. A quick workaround for users is to encrypt their data with their keys before uploading it to the cloud. However, computing over encrypted data remains highly inefficient and impractical. In this paper, we propose efficient and secure data outsourcing with the distribution of servers using a distributed hash table mechanism. It helps to compute over data from multiple owners encrypted using different keys, without leaking privacy. We observe that our proposed solution has lower computation and communication costs than other existing mechanisms, while being free from a single point of failure.
    Keywords: Distributed Hash Table; Data Outsourcing; Peer-Proxy Re-encryption; Privacy; Security.

  • A Correlation based Investigation of VM Consolidation for Cloud Computing   Order a copy of this article
    by Nagma Khattar, Jaiteg Singh, Jagpreet Sidhu 
    Abstract: Virtual machine consolidation is of utmost importance in maintaining energy-efficient cloud data centers. A tremendous amount of work is listed in the literature for the various phases of virtual machine consolidation (host underload detection, host overload detection, virtual machine selection and virtual machine placement). Benchmark algorithms proposed by pioneering researchers always serve as a base for developing other optimised algorithms. It is therefore essential to understand the behaviour of these algorithms for VM consolidation. There is a lack of analysis of these base techniques, which can otherwise lead to more computationally intensive and multidimensional solutions. There is a pressing need to investigate the behaviour of these algorithms under various tunings, parameters and workloads. This paper addresses this gap in the literature and analyses the characteristics of these algorithms in depth under various scenarios (workloads, parameters) to find their behavioural patterns. This analysis also helps in identifying the strength of relationships and correlations among parameters. A future research strategy targeting VM consolidation in cloud computing is also proposed.
    Keywords: VM consolidation; host underload detection; host overload detection; virtual machine selection; virtual machine placement; cloud computing.

  • A Case Study On Major Cloud Platforms Digital Forensics Readiness - Are We There Yet?   Order a copy of this article
    by Ameer Pichan, Mihai Lazarescu, Sie Teng Soh 
    Abstract: Digital forensics is a post-crime activity, carried out to identify the culprit responsible for the crime. Forensic activity requires crime evidence, which is typically found in logs that store events. Therefore, logs detailing user activities are a valuable and critical source of information for digital forensics in the cloud computing environment. Cloud service providers (CSPs) usually provide logging services which record activities and events with varying levels of detail. In this work, we present a detailed and methodological study of the logging services provided by three major CSPs, i.e. Amazon Web Services, Microsoft Azure and Google Cloud Platform, to elicit their forensic compliance. Our work aims to measure the forensic readiness of the three cloud platforms using their prime log services. More specifically, this paper (i) proposes a systematic approach that specifies the cloud forensic requirements; (ii) uses a generic case study of a crime incident to evaluate digital forensic readiness, showing how to extract crime evidence from the logs and validate it against a set of forensic requirements; and (iii) identifies and quantifies the gaps which the CSPs fail to satisfy.
    Keywords: Cloud Computing; Cloud forensics; Cloud log; Evidence; Forensic artifacts; Digital investigation; Digital forensics.

Special Issue on: IEEE SERVICES 2018 Edges, Fogs and Clouds as Engines of IoT

  • Edge-centric Resource Allocation for Heterogenous IoT Applications using a CoAP-based Broker   Order a copy of this article
    by Simone Bolettieri, Raffaele Bruno 
    Abstract: The Edge/Fog computing paradigm has been recently advocated for future IoT systems to cope with the capacity and latency constraints of conventional cloud-centric IoT architectures. Fog nodes are not only needed to offload computing tasks from the centralised cloud but also to provide IoT applications with management services that facilitate deployment and improve Quality of Service. Indeed, in large-scale IoT deployments, it is expected that a large number of applications access the same resources (e.g. a sensor or an actuator), most likely hosted on constrained devices. Moreover, applications can have highly heterogeneous QoS requirements, e.g., regarding real-time constraints or the frequency with which they desire to receive notifications from the monitored resources. However, IoT applications may be unable to autonomously adapt their access patterns for IoT resources to network dynamics and bandwidth limitations. To address these issues, in this work we design a fog-based broker that regulates the access to IoT resources transparently and effectively. Specifically, we develop an optimisation framework to determine the notification periods that maximise the applications' QoS satisfaction under network-related constraints. Then, we also propose practical algorithms that leverage measurements of the degree of reliability of application transmissions to infer the congestion level of the IoT resources, and adapt the notification periods accordingly. We have developed a software prototype of our broker by exploiting the standard features of the CoAP protocol. Then, we have validated the proposed solution through simulations and real experiments in an IoT testbed. Results show that, as the application demands increase, the proposed approach guarantees better QoS satisfaction, higher throughput and improved energy efficiency than a conventional CoAP proxy. Moreover, the efficacy of the optimal solution heavily depends on accurate estimates of the network capacity, which may be difficult to obtain in real-world IoT deployments.
    Keywords: Internet of Things; fog computing; resource brokering; CoAP; emulation; prototype.
    DOI: 10.1504/IJCC.2020.10026716
  • SMIoT: A Software Architecture for Maintainable Internet-of-Things Applications   Order a copy of this article
    by Ilse Bohé, Michiel Willocx, Vincent Naessens 
    Abstract: Developing sustainable IoT applications is by no means a sinecure. Many IoT apps that are currently on the market can only be coupled with a fixed set of edge devices or technologies. Meanwhile, IoT sensors and actuators evolve at a fast pace, degrading the attractiveness of many IoT applications over time. The vendor lock-in trap is often triggered by sensor-centric application development. This paper presents application-centric development as an alternative approach to tackle maintainability problems in IoT ecosystems. The paradigm shift is supported by a layered architecture called SMIoT, which guides the design of advanced IoT ecosystems. Applying the architectural principles results in IoT apps that can easily cope with new technologies that come to the market, hence increasing their lifetime and offering various infrastructural alternatives to end-users. Furthermore, the architectural principles are adopted in an Android framework implementation and validated through the design of a health care ecosystem.
    Keywords: Internet-of-Things; Software architecture; Application development; Maintainable; Sustainable; Application-centric; Framework; Android; Edge devices; Tackling Vendor lock-in.

  • The Edge Architecture for Semi-Autonomous Industrial Robotic Inspection Systems   Order a copy of this article
    by Ching-Ling Huang 
    Abstract: Robots are increasingly used in industrial applications, deployed alongside other robots and human supervisors in the automation of complex tasks such as the inspection, monitoring and maintenance of industrial assets. In this paper, we share our experience and present our implemented software framework for edge computing for semi-autonomous robotic inspection. These systems combine human-in-the-loop, semi-autonomous robots, edge computing and cloud services to achieve the automation of complex industrial tasks. The paper describes the robotic platform we developed, discussing the key architectural aspects of a semi-autonomous robotic system employed in two industrial inspection case studies: remote methane detection in oilfields, and flare stack inspections in an oil and gas production environment. We outline the requirements for the system, sharing the experience of our design and implementation trade-offs. In particular, the synergy among the semi-autonomous robots, human supervisors, model-based edge controls and cloud services is designed to achieve responsive onsite monitoring and to cope with the limited connectivity, bandwidth and processing constraints of typical industrial settings.
    Keywords: Semi-autonomous robotics; remote methane leak inspection; Unmanned Aerial Vehicle (UAV); HMI (Human Machine Interface).

  • In-network Processing for Edge Computing with InLocus   Order a copy of this article
    by Lucas Brasilino 
    Abstract: As sensor and smart device infrastructure grows, networks are increasingly heterogeneous and diverse. We propose an efficient and low-latency architecture called InLocus, which facilitates stream processing at the network's edge. InLocus balances hardware-accelerated performance with the flexibility of asynchronous software control. In this paper we extend the InLocus architecture by implementing compute nodes in a more traditional cloud-based solution in the form of Apache Kafka and the Twitter Heron framework, as well as by introducing a new runtime approach for the previously handwritten C server. We utilise a flexible platform (Xilinx Zynq SoC) to compare microbenchmarks between the latter and a High-Level Synthesis (HLS) version in programmable hardware.
    Keywords: In-network Processing; Edge Computing; Internet of Things; Programmable Logic; FPGA; Offloading; Hardware Acceleration.

Special Issue on: ICBDSDE'19 Cloud Computing for Smart Digital Environment

  • Developing a Smart Learning Environment for the Implementation of an Adaptive Connectivist MOOC Platform   Order a copy of this article
    by Soumaya EL EMRANI, Ali EL MERZOUQI, Mohamed KHALDI 
    Abstract: A pedagogical object can refer to any pedagogical component that can be used in the learning process: a text, an image, a video, a web page, etc. Personalising the pedagogical content is crucial, which makes it necessary to find collaboration agreements between pedagogical content specialists, in order to enable collaborative development and pedagogical content reuse. E-learning standards and specifications provide the solution, with possibilities for reuse, interoperability and customisation. Since the main goal of our research is to provide an adaptive cMOOC, this requires adjusting the pedagogical content to each learner profile. An adaptive learning design therefore has to present different learning strategies based on a data analytics process that includes previous and current experiences, learning styles and the learner profile. As part of our implementation, this structural design can be realised using machine learning algorithms alongside the IMS standard and related specifications.
    Keywords: Pedagogical Object; MOOC; cMOOC; Adaptive cMOOC; Machine Learning; Intelligent Platform; Pedagogical Content; IMS.

  • An Improved Pricing Algorithm for Infrastructure as a Service Clouds   Order a copy of this article
    by Seyyed-Mohammad Javadi-Moghaddam, Asieh Andarzgoo, Mohsen Saberi 
    Abstract: Marketing in cloud systems enables users to trade and share resources. For the sale of services, client applications and service providers negotiate to make a Service Level Agreement. Offering prices in the negotiation of services is challenging. A federal cloud is an efficient approach, of recent interest, to better balance risk sharing between service providers and customers. This work presents a new algorithm that increases service provider revenue while reducing user costs. The auction of the remaining time of resources, and interactions between federal clouds, increase the profits of clouds and the number of successful requests, and reduce users' costs. The simulation results confirm the expectations of the proposed approach.
    Keywords: Federal cloud; Pricing model; Service quality; Service level agreement.

  • Versioning Schemas of JSON-based Conventional and Temporal Big Data through High-level Operations in the TJSchema Framework   Order a copy of this article
    by Zouhaier Brahmia, Safa Brahmia, Fabio Grandi, Rafik Bouaziz 
    Abstract: τJSchema is a framework for managing time-varying JSON-based big data, in temporal JSON NoSQL databases, through the use of a temporal JSON schema. The latter ties together a conventional JSON schema, which is a standard JSON Schema file, and its corresponding temporal logical and temporal physical characteristics, which are stored in a temporal characteristics document. The conventional JSON schema and the temporal characteristics could evolve over time to satisfy new requirements of the NoSQL database administrator (NSDBA) or to comply with changes in the modelled reality. Accordingly, the corresponding temporal JSON schema also evolves over time. In our previous work (Brahmia et al., 2017, 2018b, 2019a), we proposed low-level operations for changing such schema components. However, these operations are not NSDBA-friendly, as they are too primitive. In this paper, we deal with operations that help NSDBAs maintain these schema components in a more user-friendly and compact way. In fact, we propose three sets of high-level operations for changing the temporal JSON schema, the conventional JSON schema, and the temporal characteristics. These high-level operations are based on our previously proposed low-level operations. They are also consistency-preserving and more helpful than the low-level ones. To improve the readability of their definitions, we have divided these new operations into two classes: basic high-level operations, which cannot be defined through other basic high-level operations, and complex ones.
    Keywords: Big Data; NoSQL; JSON; JSON Schema; TJSchema; Conventional JSON schema; Temporal JSON schema; Temporal logical characteristic; Temporal physical characteristic; Schema change operation; Schema versioning; temporal databases.

  • Versioning Temporal Characteristics of JSON-based Big Data via the τJSchema Framework   Order a copy of this article
    by Safa Brahmia, Zouhaier Brahmia, Fabio Grandi, Rafik Bouaziz 
    Abstract: Several modern applications which exploit Big Data (e.g., the Internet of Things and Smart Cities) require the analysis of a complete history of the changes performed on these data, which may also include modifications to their schemas (or structures). Although schema versioning has long been advocated as the best solution to cope with this issue, there are currently no technical solutions, provided by existing Big Data management systems (especially NoSQL DBMSs), for handling temporal evolution and versioning aspects of Big Data. In (Brahmia et al., 2016), for a disciplined and systematic approach to the temporal management of JSON-based Big Data in NoSQL databases, we proposed the use of a framework named τJSchema (temporal JSON Schema). It allows the definition and validation of temporal JSON documents that conform to a temporal JSON schema. A τJSchema schema is composed of a conventional (i.e., non-temporal) JSON schema annotated with a set of temporal logical and temporal physical characteristics. Moreover, since these two components could evolve over time to respond to new application requirements, we extended τJSchema, in (Brahmia et al., 2017), to support versioning of conventional JSON schemas. In this work, we complete the picture by extending our framework to also support versioning of temporal logical and physical characteristics. In fact, we propose a technique for temporal characteristics versioning, and provide a complete set of low-level change operations for the maintenance of these characteristics; for each operation, we define its arguments and its operational semantics. Thus, with the proposed extension, τJSchema will provide full support for temporal versioning of JSON-based Big Data at both the instance and schema levels.
    Keywords: Big Data; NoSQL; JSON; JSON Schema; τJSchema; Conventional JSON schema; Temporal JSON schema; Temporal logical characteristic; Temporal physical characteristic; Schema change; Schema versioning.

  • Analysing Knowledge in Social Big Data   Order a copy of this article
    by Lejdel Brahim 
    Abstract: Big data has become an important issue for a large number of research areas such as data mining, machine learning, computational intelligence, the semantic Web and social networks. The combination of big data technologies and traditional machine learning algorithms has generated new and interesting challenges in areas such as social media and social networks. These new challenges are focused mainly on problems such as data processing, data storage, data representation and data visualization. In this paper, we present a new approach that can extract entities and their relationships from social big data, allowing for the inference of new, meaningful knowledge. It is a hybrid approach combining multi-agent systems with the K-means algorithm.
    Keywords: K-means; Multi-Agent Systems; Big data; data mining; social networks.
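
    As a rough illustration of the K-means side of such a hybrid (the multi-agent layer and entity extraction are beyond a short example, and all data below are invented), a minimal pure-Python sketch:

    ```python
    import random
    from math import dist

    def kmeans(points, k, iters=50, seed=0):
        """Plain k-means: assign points to the nearest centroid, then recompute."""
        rng = random.Random(seed)
        centroids = rng.sample(points, k)
        for _ in range(iters):
            clusters = [[] for _ in range(k)]
            for p in points:
                clusters[min(range(k), key=lambda i: dist(p, centroids[i]))].append(p)
            new = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centroids[i]
                   for i, cl in enumerate(clusters)]
            if new == centroids:   # converged
                break
            centroids = new
        return centroids, clusters

    # Two obvious groups of 2-D "interaction" vectors
    pts = [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9), (4.8, 5.0)]
    cents, groups = kmeans(pts, 2)
    sizes = sorted(len(g) for g in groups)
    print(sizes)  # [3, 3]
    ```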

Special Issue on: Big Data Computing and Sustainable Cloud Communication Systems

  • An IoT based Secure Data Transmission in WBSN   Order a copy of this article
    by I. Karthiga, Sharmila Sankar 
    Abstract: The Internet of Things (IoT) is deemed the new-age technology that is anticipated to be a blessing for human life. The basic conception of the IoT is to connect objects and give them complete access to the Internet. Transmitting information via the IoT with maximal security is a vital process. Cryptography, one of the domains of network security, is a mechanism that helps make the data transmission process secure over a wireless or wired channel; along with that, it provides confidentiality, authenticity and integrity of data, and prevents repudiation. Here, a framework is developed for the secure transmission of data from the Wireless Body Sensor Network (WBSN) in the IoT environment. Initially, the sensors on the human body gather data from its signals. This data is then directed to the cloud through a gateway. The three stages implemented are: a) authentication, b) security and c) load balance. In authentication, three steps are executed: i) registration, ii) login and iii) verification. The data is securely transmitted using Optimized Elliptical Curve Cryptography (OECC). To allow multiple users to use the network, load balancing (LB) is done using Krill Herd Load Balancing (KHLB). The performance of the proposed OECC and KHLB is analyzed by comparing them with existing methodologies on performance measures such as encryption time, decryption time, security level, packet loss ratio (PLR), end-to-end delay and throughput. The proposed encryption and LB scheme achieves the best performance compared with other existing techniques on all performance measures.
    Keywords: Internet of Things; Wireless Body Sensor Networks; Optimized Elliptical Curve Cryptography; Krill Herd Load Balancing Algorithm; Authentication; Security.

  • Hybrid PSO-DE Algorithm Based Trust and Congestion Aware Cluster Routing Algorithm for MANET   Order a copy of this article
    by A. EMMANUEL P.E.O. MARIADAS, R. Madhanmohan  
    Abstract: Trust awareness, mobility, link lifetime and energy efficiency are key problems in Mobile Ad hoc Networks (MANETs), where nodes move unpredictably with limited battery life, resulting in frequent topology changes. This also makes the network attractive to attackers, so packet delivery inside the network must be adequately protected. These basic constraints have been widely studied with the aim of increasing the secure lifetime of the network. This paper concentrates on the problems of lifetime, mobility and energy efficiency, and develops a clustering-based routing algorithm whose fitness function is evaluated using a hybrid of Particle Swarm Optimization (PSO) and Differential Evolution (DE). In the proposed hybrid PSO-DE clustering algorithm, cluster head election considers trust, lifetime, mobility, buffer occupancy, remaining energy and degree of connectivity when selecting the nodes that serve as cluster heads. The proposed work is evaluated extensively in the NS-3 network simulator against other existing algorithms. The results show the effectiveness of the proposed algorithm in terms of network lifetime, average lifetime, average number of clusters formed, average number of re-clusterings required, energy consumption and other parameters such as the packet delivery ratio.
    Keywords: Cluster Head; Fitness value; Energy efficiency; Trust; Life time; Mobility.

  • Kernel interpolation-based technique for privacy protection of pluggable data in cloud computing   Order a copy of this article
    by Manoj L. Bangare, Sarang A Joshi 
    Abstract: The recent technological revolution has been driven by cloud computing. The explosive availability of unstructured data in the cloud has gained the attention of researchers: users store their data in the cloud without any right to control it, causing privacy concerns. Therefore, effective privacy protection techniques are needed to assure the privacy of user data in the cloud. Accordingly, this paper proposes a Kernel interpolation-based technique for preserving the privacy of data in the cloud. Privacy and accuracy are the two factors that assure data privacy; both are afforded by the proposed technique, which uses the proposed Rider-Cat Swarm Optimization (R-CSO) algorithm to compute the Kernel interpolation coefficient associated with affording privacy in the cloud. The R-CSO algorithm integrates the standard Cat Swarm Optimization (CSO) into the standard Rider Optimization Algorithm (ROA). Analysis of the methods using a dataset from the UCI machine learning repository reveals that the proposed method acquired a maximal accuracy of 80.552% and a minimal database difference ratio (DBDR) of 15.58%.
    Keywords: Cloud computing; Pluggable Data; privacy preservation; Interpolation; Optimization; Rider-Cat Swarm Optimization; DBWorld; Kernel interpolation-based; Termination; Rider Optimization Algorithm; Ciphertext-policy; R-CSO algorithm; Block diagram; Hash-Solomon code algorithm.
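
    The abstract does not detail the interpolation itself; as a generic, hypothetical illustration of kernel interpolation (unrelated to the authors' R-CSO coefficient search), radial-basis-function coefficients can be fitted by solving the kernel system directly:

    ```python
    from math import exp

    def rbf_kernel(x, y, gamma=1.0):
        """Gaussian radial basis function kernel."""
        return exp(-gamma * (x - y) ** 2)

    def solve(A, b):
        """Gaussian elimination with partial pivoting (tiny systems only)."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    # Fit coefficients so the interpolant passes through (x_i, y_i) exactly
    xs, ys = [0.0, 1.0, 2.0], [1.0, 3.0, 2.0]
    K = [[rbf_kernel(a, b) for b in xs] for a in xs]
    coef = solve(K, ys)
    interp = lambda x: sum(c * rbf_kernel(x, xi) for c, xi in zip(coef, xs))
    print(round(interp(1.0), 6))  # 3.0
    ```

    In the paper the interpolation coefficient is instead searched for by the R-CSO metaheuristic; the direct solve above only shows what "interpolation coefficients" mean.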

  • Feature vector extraction and optimization for multimodal biometrics employing face, ear and gait utilizing artificial neural networks   Order a copy of this article
    by Haider Mehraj, Ajaz Hussain Mir 
    Abstract: Cloud computing is a rapidly growing model for providing resources to users over the Internet. Numerous business establishments have adopted the cloud computing environment as it has low upfront costs, is scalable and provides rapid deployment. Many consumers store their sensitive information in the cloud, and as such there need to be strong authentication mechanisms in place so that only authorized users are able to access it. Multimodal biometrics is an upcoming research area for improving the security of the cloud. In this work, a novel multimodal biometric fusion system is proposed for enhanced cloud security, using three different biometric modalities, face, ear and gait, based on the Speed-Up Robust Features (SURF) descriptor along with a Genetic Algorithm (GA). An artificial neural network (ANN) is utilized as a classifier for each biometric modality. The fusion process has been effectively tested using different images corresponding to all subjects from three databases: the AMI Ear Database, the Georgia Tech Face Database and the CASIA Gait Database. Because of these biometric traits, the proposed method requires no significant user assistance and can also work from a long distance. Before fusion, the SURF features are optimized using the genetic algorithm and cross-validated using the artificial neural network. The evaluations are done on publicly available databases, demonstrating the accuracy of the proposed system compared with existing systems. It is observed that the amalgamation of face, ear and gait gives better performance in terms of accuracy, precision, recall and F-measure.
    Keywords: Cloud Computing; Biometric Fusion; Feature Vector; SURF; GA; ANN; precision; recall; kappa; accuracy; F-measure.

  • Green Factors of Referential Value based Software Component Repository   Order a copy of this article
    by Pradeep Kumar, Shailendra Narayan Singh, Sudhir Dawra 
    Abstract: Reference-based green computing has different dimensions such as time, cost and space. The time and space required by a software system depend upon the development process of the software and the algorithms used during development. The green parameters of the software are directly proportional to the space and time required to process it. In the referential software development system, new dimensions come into the picture, such as the length of the networks used, the technology adopted in the network systems and the response time of the middleware. In this paper, all green parameters, including the new dimensions proposed in the referential software development process, have been calculated. The proposed technique presents the mapping of addresses required to simulate referential software in the cloud computing scenario.
    Keywords: Referential software; Cloud computing; Green computing; Computer networks.

  • A Novel Approach for Merging Ontologies using Formal Concept Analysis   Order a copy of this article
    by Priya. Munusamy, Ch. Aswani Kumar 
    Abstract: Ontologies are mainly used for knowledge sharing and also as a knowledge structure. Due to the growing nature of ontologies, merging information in the corporate realm becomes critical. In existing methods, formal concept analysis does not provide an efficient pseudo-intent calculation and does not handle large contexts. The proposed technique addresses the issue of ontology heterogeneity that blocks ontology interoperability, and proposes a novel algorithm called Advanced Formal Concept Analysis Merge (AFCA-Merge). The AFCA-Merge algorithm performs four phases to merge two given ontologies. In the first phase, it obtains the perfect attribute for the matching object using a decision tree and a pseudo-intent technique. In the second phase, the obtained results are stored in a linked list as a formal context. In the third phase, the perfect relationships among the formal contexts in the linked list are identified using backtracking techniques. Finally, the merging phase performs the merging between the identified relations. The experimental outcome shows that AFCA-Merge provides 97% precision, 82% recall and 89% accuracy, which is better than the existing techniques.
    Keywords: AFCA-Merge; Formal Concept Analysis; Formal Context; Ontology Merging.
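
    The AFCA-Merge algorithm itself is not reproduced here, but the formal-concept machinery it builds on can be illustrated: a formal concept is a pair (extent, intent) closed under the two derivation operators. A naive enumeration over a toy context (data invented for illustration):

    ```python
    from itertools import chain, combinations

    # Toy formal context: objects -> attributes
    context = {
        "doc1": {"ontology", "merge"},
        "doc2": {"ontology", "lattice"},
        "doc3": {"merge", "lattice"},
    }
    attrs = set().union(*context.values())

    def extent(B):
        """Objects having all attributes in B."""
        return {g for g, A in context.items() if B <= A}

    def intent(G):
        """Attributes shared by all objects in G."""
        return set.intersection(*(context[g] for g in G)) if G else set(attrs)

    # A formal concept is a pair (G, B) with intent(G) == B and extent(B) == G;
    # closing every attribute subset enumerates them all (exponential, toy-sized only).
    concepts = set()
    for B in chain.from_iterable(combinations(sorted(attrs), r)
                                 for r in range(len(attrs) + 1)):
        G = extent(set(B))
        concepts.add((frozenset(G), frozenset(intent(G))))
    print(len(concepts))  # 8
    ```

    Real FCA merge algorithms avoid this brute force (that is exactly the pseudo-intent efficiency issue the abstract mentions), but the closure property shown here is what they preserve.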

  • Spectral and Spatial Features Based HSI Classification Using Multiple Neuron Based Learning Approach   Order a copy of this article
    by Venkatesan Rudhrakoti, Prabu Sevugan 
    Abstract: With the improvement of remote sensing applications, hyperspectral images are used in a large number of applications, and much work has been done on extracting features from remote sensing data and learning accurate classifiers. The spectral and spatial data of images allow results to be classified with improved accuracy; fusing spatial and spectral data is an effective way of improving the accuracy of hyperspectral image classification. In this work, we propose a hyperspectral image classification method based on spectral and spatial details, using neural network classifiers and a multiple-neuron learning approach to classify remote sensing images with specific class labels. The spectral and spatial features are extracted from boundary values using decision boundary feature extraction (DBFE). These extracted features are trained using convolutional neural networks (CNN) to improve the accuracy of class labelling. The methodology entails training with a regularizer added to the loss function used to train the neural networks. Training is done using multiple layers with additional balancing constraints to avoid falling into local minima. In the testing phase, each remote sensing image is classified while avoiding a false ground-truth map. Experimental results show improved class-specific accuracy compared with other state-of-the-art algorithms.
    Keywords: Hyperspectral imaging; Classification; Feature extraction; Neural networks; Class labels.

  • Correlative Study and Analysis for Hidden Patterns in Text Analytics Unstructured Data using Supervised and Unsupervised Learning techniques   Order a copy of this article
    by E.Laxmi Lydia, S. Kannan, S.Suman Rajest 
    Abstract: Two-thirds of the data generated by the internet is unstructured text in the form of emails, audio, video, PDF files, word documents and text documents. Extraction of these unstructured text patterns using mining techniques achieves quick access to outcomes. Textual data available online contains different patterns, and when huge amounts of incoming unstructured data enter the system, organizing those documents into meaningful groups becomes a problem. This paper discusses document classification using supervised learning, focusing on a concept-based algorithm, and also deals with the hidden patterns in documents using an unsupervised clustering technique and topic-based modelling, analysing and improving the systematic arrangement of documents by applying the k-means and LDA algorithms. Finally, it presents a comparative study and the importance of clustering over classification for unstructured documents.
    Keywords: Text Analytics; Concept Based method; Data Representation and Storage; Latent Dirichlet Allocation (LDA) Algorithm.
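
    LDA and full k-means pipelines need real corpora; as a self-contained illustration of grouping unstructured documents by content similarity, here is a greedy cosine-similarity clustering sketch (toy documents and an arbitrary threshold, not the paper's pipeline):

    ```python
    from collections import Counter
    from math import sqrt

    docs = [
        "cloud storage cloud service",
        "cloud computing service model",
        "football match goal score",
        "goal keeper football league",
    ]

    def vectorize(text):
        """Bag-of-words term-frequency vector."""
        return Counter(text.split())

    def cosine(a, b):
        dot = sum(a[t] * b[t] for t in a if t in b)
        na = sqrt(sum(v * v for v in a.values()))
        nb = sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb)

    # Greedy single-pass clustering: start a new group when no existing
    # group's representative is similar enough.
    groups, reps = [], []
    for d in docs:
        v = vectorize(d)
        best = max(range(len(reps)), key=lambda i: cosine(v, reps[i]), default=None)
        if best is not None and cosine(v, reps[best]) > 0.2:
            groups[best].append(d)
        else:
            groups.append([d])
            reps.append(v)
    print(len(groups))  # 2
    ```

    K-means would instead iterate centroid updates over these vectors, and LDA would model each document as a mixture of topics; the point here is only that similarity over term vectors separates the two subject groups.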

    by Sathish Kumar Ravichandran 
    Abstract: Effective organization of a warehouse's incoming goods section is as important for its productivity as ensuring efficient shelving systems. When the incoming goods section is not properly configured, it almost automatically causes major interruptions throughout the subsequent storage phase. For effective storage of goods in a warehouse, a Farm Optimization Algorithm (FOA) is proposed. The efficacy of the proposed approach is demonstrated using the BR data sets and compared with different optimization algorithms. The experiments show that the suggested FOA fulfils the objective of efficient arrangement of goods in the warehouse, and that the order in which goods are placed into the warehouse is closer to ideal than with other competitive optimization algorithms.
    Keywords: Effective organization; Warehouses; Farm Optimization algorithm; Efficient arrangement; Optimization algorithm.

  • Hybrid Swarm Intelligence for Feature Selection on IoT based Infrastructure   Order a copy of this article
    by Nallakaruppan Kailasanathan, Senthilkumaran Ulaganathan 
    Abstract: Swarm intelligence techniques are deployed to estimate fitness over search spaces and perform optimization. Since the advent of the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), optimization problems and complex real-world problems have been solved with ease. There remains a need to enhance optimization performance and search-space exploration; Seyedali Mirjalili contributed to this by proposing Moth-Flame Optimization (MFO). This algorithm provides better solutions as the iterations increase: the fittest moth-flame combinations are retained, the number of flames is reduced in every iteration, and the final iteration yields the best moth-flame combination. However, MFO is prone to local minima, and its high convergence rate may cause it to skip the global optimum. Combining Simulated Annealing (SA) with MFO provides a solution to local minima, increases population diversity and exploration, and moderates the convergence rate so that MFO can reach the global optimum, ultimately improving its performance. This is the first attempt to apply this hybrid swarm intelligence to IoT (Internet of Things) databases, through which we select the features (attributes) that impact the IoT decision-making process.
    Keywords: IoT (Internet of Things); Moth-Flame Optimization (MFO); Simulated Annealing (SA); KNN Classification (K-Nearest Neighbour Classification); Genetic Algorithm (GA); Particle Swarm Optimization (PSO).
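
    The paper's MFO-SA hybrid is not specified here; the following is a bare simulated-annealing feature-selection wrapper over a toy 1-NN fitness, sketched only to illustrate the general wrapper idea (dataset, seed and schedule are all made up):

    ```python
    import random
    from math import exp, dist

    # Tiny synthetic dataset: feature 0 is informative, features 1-2 are noise.
    rng = random.Random(1)
    X = [[i / 10 + rng.uniform(-0.05, 0.05), rng.random(), rng.random()]
         for i in range(20)]
    y = [0 if i < 10 else 1 for i in range(20)]

    def knn_accuracy(mask):
        """Leave-one-out 1-NN accuracy using only the selected features."""
        if not any(mask):
            return 0.0
        proj = [[v for v, m in zip(row, mask) if m] for row in X]
        hits = 0
        for i in range(len(X)):
            j = min((k for k in range(len(X)) if k != i),
                    key=lambda k: dist(proj[i], proj[k]))
            hits += y[j] == y[i]
        return hits / len(X)

    def anneal(iters=200, temp=1.0, cool=0.98):
        cur = [1, 1, 1]
        cur_fit = knn_accuracy(cur)
        for _ in range(iters):
            cand = cur[:]
            cand[rng.randrange(len(cand))] ^= 1   # flip one feature in or out
            fit = knn_accuracy(cand)
            # Accept improvements always, worse moves with cooling probability
            if fit >= cur_fit or rng.random() < exp((fit - cur_fit) / temp):
                cur, cur_fit = cand, fit
            temp *= cool
        return cur, cur_fit

    mask, fit = anneal()
    print(mask, round(fit, 3))
    ```

    In the hybrid described by the abstract, the candidate subsets would come from MFO's moth-flame update rather than a single random bit flip, with SA acceptance preventing premature convergence.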

  • Development of a network packet sniffing tool for Internet Protocol generations   Order a copy of this article
    by Ruwaidah F. Albadri 
    Abstract: Packet sniffing is a way to inspect each packet as it flows across the network. One of the most complex problems facing network administrators is network analysis. The information provided by existing tools for network traffic analysis is very limited, yet the data become enormous if all of it is stored for later analysis, making it difficult to analyze or even store. This paper proposes a sniffing tool to capture the packets of both IPv4 and IPv6. The captured packets can be stored temporarily in order to obtain statistics about the network, such as the most frequently used source and destination ports. The proposed tool captures and analyzes the network in a simple way without wasting storage: it accesses the captured packets and extracts their information using the Socket class in Visual Studio. The captured information is analyzed directly and illustrated in graphs and tables to give the user all the information in a simple way. Two scenarios are used in this paper to validate the tool's capability. The first scenario analyzes IPv4 by capturing packets and identifying the ports, protocols and packets used; the second analyzes IPv6 in the same way. The results indicate the tool's ease of use, thanks to a user-friendly graphical interface and simple buttons and menus. The results vary from IPv4 to IPv6 in the number of captured protocols and the source and destination ports used.
    Keywords: Network analyzer; Network sniffer; sniffing tool; network packets capturing.
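
    The tool itself is built on the .NET Socket class; independently of that, decoding a captured IPv4 header follows the fixed RFC 791 layout, which can be sketched as (capturing live traffic would additionally need a raw socket and elevated privileges, omitted here):

    ```python
    import struct
    import socket

    def parse_ipv4_header(raw: bytes) -> dict:
        """Decode the fixed 20-byte IPv4 header (RFC 791 layout)."""
        (ver_ihl, tos, total_len, ident, flags_frag,
         ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", raw[:20])
        return {
            "version": ver_ihl >> 4,
            "ihl": ver_ihl & 0x0F,          # header length in 32-bit words
            "total_length": total_len,
            "ttl": ttl,
            "protocol": proto,              # 6 = TCP, 17 = UDP
            "src": socket.inet_ntoa(src),
            "dst": socket.inet_ntoa(dst),
        }

    # A hand-crafted header: version 4, IHL 5, TTL 64, TCP, 10.0.0.1 -> 10.0.0.2
    hdr = struct.pack("!BBHHHBBH4s4s", (4 << 4) | 5, 0, 40, 1, 0, 64, 6, 0,
                      socket.inet_aton("10.0.0.1"), socket.inet_aton("10.0.0.2"))
    info = parse_ipv4_header(hdr)
    print(info["version"], info["protocol"], info["src"])  # 4 6 10.0.0.1
    ```

    Counting the `protocol` and port fields of each parsed packet is how a sniffer accumulates the per-protocol and per-port statistics the abstract describes.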

  • A Methodology For Evaluating The Transparency Of eGovernment Websites   Order a copy of this article
    by Ishaku Liti Awalu, Hung Kook Park 
    Abstract: Over the years, ICTs have transformed governance, with websites increasingly being the focal point of government information and service delivery to citizens. Several evaluation methodologies and standards, such as usability, accessibility, credibility and functionality, have been developed. Lately, due to increased demand by stakeholders for open data and open information, Transparency has become one of the standards for evaluating the openness of government information and services. Currently, government-wide Transparency is measured by organizations such as the Open Government Partnership (OGP) and the EU. Unfortunately, despite a few conceptual frameworks proposed by some researchers, there is no thorough research into the practical evaluation of the Transparency of government websites. Consequently, this paper proposes a methodology for evaluating government website Transparency. Available literature was reviewed to identify attributes of website Transparency; the attributes were analyzed for redundancy and subsequently grouped, using deductive reasoning, into four major categories/indicators: information, reliability, reachability and accountability documents. Information is sub-categorized into website content and the way that content is presented or rendered on the website. Reliability comprises attributes such as privacy policy, spelling errors, branding, etc. Reachability consists of contact details of key officials of the organization as well as of the organization itself. Accountability documents are reports and statistics, budget or procurement documents, freedom-of-information reports, annual reports, etc. Finally, the paper highlights how the methodology can be applied to the evaluation and ranking of government websites based on Transparency.
    Keywords: Transparency; Website Evaluation; Web Engineering.

  • OREA for improving data packet transmission in Wireless Sensor Networks with Cloud Security Mechanism   Order a copy of this article
    by Senthil Kumar, Thirukrishna J T 
    Abstract: Wireless Sensor Networks (WSNs) are often used for observing physical-world applications and perform effective automation processes. Sensor networks contain numerous nodes that can sense and gather statistical data over the deployed environment. These sensor nodes are powered by a battery fixed inside the node, so it is difficult to replace or remove. One of the prime design issues in WSNs is therefore power consumption, i.e., energy: nodes consume battery energy whenever sensed data is transmitted to the sink. The proposed Optimized Radio Energy Algorithm (OREA) provides efficient energy dissipation and faster data transmission to the sink. The overall performance of a service in a WSN is measured by its Quality of Service (QoS); the QoS metrics of traffic load and packet delivery ratio are compared for OREA against existing algorithms such as random and homogeneous selection. OREA provides better QoS delivery and a prolonged battery lifetime, achieving efficient usage of power. MATLAB simulation results show that the network lifetime is prolonged in comparison with existing algorithms, and a cloud security mechanism is also provided.
    Keywords: Wireless Sensor Networks; IEEE 802.15.4; Quality of Service; Energy; Sensor Node; 6LoWPAN; Cloud Security.

  • Cluster-based Authentication for Machine Type Communication in LTE Network using Elliptic Curve Cryptography   Order a copy of this article
    by K. Krishna Jyothi, Shilpa Chaudhari 
    Abstract: Machine Type Communication (MTC) is a significant approach for communication in Long Term Evolution (LTE) networks. This paper introduces a new cluster-based authentication model for MTC in the LTE network. The presented framework includes the following phases: (i) clustering, (ii) node registration, (iii) optimal Cluster Head Selection (CHS), (iv) CH authentication and (v) MTC. At first, the nodes in the LTE network undergo clustering via the k-means approach, and subsequently the nodes register themselves. As each cluster needs a head, it is essential to select the optimal node to act as the representative of all the remaining nodes in the cluster. However, the selection of an optimal cluster head is a tedious process, and hence this paper establishes a novel hybrid approach for selecting the CH, termed the Wolf Insisted Jaya Algorithm (WI-JA), which hybridizes the concepts of the Jaya Algorithm (JA) and Grey Wolf Optimization (GWO). As a novelty, each selected cluster head maintains the records of all the remaining nodes in the cluster, handled via blockchain technology. Finally, in the authentication phase, rather than authenticating each node in the network, only the chosen cluster head is authenticated, since it vouches for the existing nodes in the network. Elliptic Curve Cryptography (ECC)-based CH authentication is used for this purpose. The proposed structure provides a roadmap for secure MTC. The performance of the presented framework was evaluated and proved against other conventional models with respect to certain measures.
    Keywords: MTC; Authentication; LTE network; Jaya Algorithm; Grey Wolf Algorithm.
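
    ECC-based authentication ultimately rests on elliptic-curve scalar multiplication. A toy illustration of the shared-secret property (deliberately tiny, insecure parameters, and not the paper's protocol):

    ```python
    # Toy elliptic-curve point arithmetic over GF(p), curve y^2 = x^3 + ax + b.
    # Curve and generator are illustrative only -- far too small for real use.
    p, a, b = 97, 2, 3
    G = (3, 6)   # lies on y^2 = x^3 + 2x + 3 (mod 97), since 36 == 36

    def add(P, Q):
        """Group law: None represents the point at infinity."""
        if P is None: return Q
        if Q is None: return P
        if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
            return None
        if P == Q:
            lam = (3 * P[0] ** 2 + a) * pow(2 * P[1], -1, p) % p
        else:
            lam = (Q[1] - P[1]) * pow(Q[0] - P[0], -1, p) % p
        x = (lam ** 2 - P[0] - Q[0]) % p
        return (x, (lam * (P[0] - x) - P[1]) % p)

    def mul(k, P):
        """Double-and-add scalar multiplication."""
        R = None
        while k:
            if k & 1:
                R = add(R, P)
            P, k = add(P, P), k >> 1
        return R

    # ECDH-style agreement between a cluster head and the network core:
    priv_ch, priv_net = 13, 29
    shared_ch = mul(priv_ch, mul(priv_net, G))
    shared_net = mul(priv_net, mul(priv_ch, G))
    print(shared_ch == shared_net)  # True
    ```

    Real deployments use standardized curves with ~256-bit primes; the commutativity shown (k1·k2·G = k2·k1·G) is what lets the two parties authenticate challenges without transmitting private keys.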

  • Enhanced Modulation Scheme for Cognitive Radio Over Rayleigh Fading Channels Using Power Allocation And Spectrum Sensing Models   Order a copy of this article
    by B.Maheswara Rao, S. Baskar 
    Abstract: The majority of spectrum occupancy measurements have been carried out in the context of cognitive radio. Measurements of spectrum occupancy are typically taken at elevated outdoor points such as building roofs, balconies and towers. Even though these measurement scenarios provide a good estimation of the spectral activity of the primary transmitters, in practical situations where users are not located at such high points, they cannot be considered representative of the spectrum occupancy perceived by a cognitive radio user. Over fading channels, spectrum sensing is the most important operation of cognitive radios, and the fading margin and the number of relays within a wireless communication link play a key role in sensing performance. Various sensing detectors have been proposed in the literature under the assumption that the primary user is either completely present or completely absent within the observation window. Using various modulation schemes for cognitive radio over Rayleigh fading channels, this paper studies the effect of the primary user duty cycle on spectrum sensing performance.
    Keywords: Cognitive Radio; Rayleigh Fading; Spectrum Sensing; Signal Transmission; Fading Cycle.

  • A Novel data sharing model for cloud environment using dual key authentication   Order a copy of this article
    by Gowtham Mamidisetti, Ramesh Makala 
    Abstract: In the present world scenario, cloud computing plays a significant role in sharing data between computing devices in a secure manner. Specifically, data sharing among dynamic or static groups has grown into a major factor. For this purpose, this study proposes a dual-key-authentication-based dynamic group data sharing model in a cloud environment. First, data are encrypted before sharing using the AES algorithm; data are stored as strings instead of bytes, which reduces the memory size without affecting the original data. Second, a cryptographic approach is applied to overcome the issues of computational load and key-length maintenance: a subkey is generated from the private key, and each user receives a different subkey. Third, the model prevents secret data from collusion attacks and improves malicious-activity prediction during data transfer, so a revoked user cannot violate data confidentiality, and the original data cannot be decrypted by anyone else even if they receive it. Finally, this study evaluates the performance of the proposed cloud security model against existing methods, especially PRE, CL-PRE and SaDaSc.
    Keywords: Data Sharing; Encrypted Cloud Data; Cloud Services; Dynamic Group Data; Security; Access Control.
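
    The subkey construction is not specified in the abstract; one generic way to give each user a distinct subkey derived from the owner's private key is an HMAC-based derivation (purely illustrative, not the authors' scheme; the key value below is made up):

    ```python
    import hmac
    import hashlib

    MASTER_KEY = b"group-owner-private-key"   # illustrative value only

    def derive_subkey(user_id: str, epoch: int) -> bytes:
        """Derive a distinct per-user subkey from the group master key.

        Revocation can be modelled by bumping the epoch: subkeys issued
        under an old epoch no longer match anything derived under the new one.
        """
        info = f"{user_id}|{epoch}".encode()
        return hmac.new(MASTER_KEY, info, hashlib.sha256).digest()

    k_alice = derive_subkey("alice", 1)
    k_bob = derive_subkey("bob", 1)
    k_alice_old = derive_subkey("alice", 0)
    print(k_alice != k_bob, k_alice != k_alice_old)  # True True
    ```

    Each derived subkey could then key a per-user AES session, which matches the abstract's requirement that every user receives a different subkey of the private key.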

Special Issue on: CUDC - 2019 Emerging Research Trends in Engineering, Science and Technology

  • Comparative Analysis of different Polynomial Interpolations for implementing Key Management techniques in MANETs   Order a copy of this article
    by Chetna Monga, K.R. Ramkumar, Shaily Jain 
    Abstract: The backbreaking issue in Mobile Ad hoc NETworks (MANETs) is ensuring security, which abounds due to their dynamic nature and the unavailability of centralized infrastructure. Due to the distributed nature of the network, trading off complexity has so far been found to be a natural remedy to ensure security. In order to secure MANETs, we inspect two polynomial interpolation approaches, one known as Lagrange's interpolation and the other as curve fitting. The key shares are disseminated among a predefined fraction of nodes called Security Association Members (SAMs). To facilitate certificate management in a versatile manner, an identity (ID)-based method with a polynomial-based interpolation approach is used. A new node has to fit the parameters set by these SAMs so as to acquire the required quantity of key shares. As the key shares are transferred through error-free and error-prone channels, the assumptions are made accordingly. The analysis shows the superiority of curve fitting over Lagrange's approach, as the intricacy of generating the polynomial in Lagrange's approach is higher than in curve fitting. The results reveal the acute accuracy of the curve fitting approach, along with lower memory and time consumption for each order of polynomial.
    Keywords: MANETs; Key Management; Polynomial Interpolation; Lagrange Interpolation; Curve Fitting; Security; Accuracy; Memory consumption; Secret Key; Node-ID.
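
    Lagrange-interpolation key sharing of the kind the abstract describes is commonly realized Shamir-style: shares are points on a random polynomial whose constant term is the secret, and any threshold-sized subset reconstructs it by interpolating at x = 0. A minimal sketch (toy field size and values, not the paper's exact construction):

    ```python
    import random

    P = 2**13 - 1  # small prime field, illustration only

    def make_shares(secret, k, n, seed=0):
        """Split `secret` into n shares; any k of them reconstruct it."""
        rng = random.Random(seed)
        coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
        return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
                for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 recovers the constant term."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, -1, P)) % P
        return secret

    # 5 SAMs hold shares; any 3 cooperating SAMs can rebuild the key
    shares = make_shares(secret=1234, k=3, n=5)
    print(reconstruct(shares[:3]) == 1234, reconstruct(shares[1:4]) == 1234)  # True True
    ```

    The curve-fitting alternative studied in the paper replaces this exact interpolation with a least-squares fit, which is where its accuracy-versus-cost trade-off arises.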

  • Software Defined Networking: A Crucial Approach for Cloud Computing Adoption   Order a copy of this article
    by Sumit Badotra, Surya Narayan Panda 
    Abstract: The most important convenience contributed by the cloud is that it lets you deliver an infrastructure framework and various services rapidly: instead of ordering, installing and then configuring a lot of servers, you can provision a particular number of virtual machines (VMs). With the networking approach currently used in the cloud, the network is becoming a hurdle to scalability, and has therefore become one of the most complex and time-consuming parts of deploying an application. By introducing the Software Defined Networking (SDN) approach, the network infrastructure and its services can be configured through a well-defined Application Programming Interface (API), and the manageability of the cloud network is enhanced along with its scalability; the collaboration of cloud and SDN has consequently become one of the hottest topics today. This study aims to show the importance of SDN in the cloud. To limit the hurdles in cloud infrastructure, especially in large data centers, a detailed study of its importance, architecture and advantages is presented. A newly emerged simulation tool (CloudSimSDN) is also illustrated, with a detailed explanation of how to execute experiments with it.
    Keywords: cloud computing; data centers; software defined networking; data plane; control plane; application programming interface.

  • Performance Comparison of Various Techniques for Automatic Licence Plate Recognition Systems   Order a copy of this article
    by Nitin Sharma, Pawan Kumar Dahiya, Baldev Raj Marwah 
    Abstract: Automatic licence plate recognition systems are direly needed nowadays for various applications such as toll collection, parking, identification of stolen cars, incident management, electronic payment services, electronic customs clearance of commercial vehicles, automatic security roadside inspection, security monitoring in a car, emergency notification, and personal security. An automatic licence plate recognition system performs three important processing steps on the input image: extraction, segmentation, and recognition. A number of algorithms have been developed for these steps over the last few years, the result of which is a significant improvement in licence plate recognition. The aim of this study is a survey of the existing techniques for licence plate recognition. In this paper, a number of existing techniques for automatic licence plate recognition are presented and their benefits and limitations are discussed. Further, the paper foresees the future scope of the automatic licence plate recognition area.
    Keywords: Automatic Licence Plate Recognition System (ALPR); Neural Network (NN); Optical Character Recognition (OCR); Support Vector Machine (SVM).

Special Issue on: Machine Learning and Artificial Intelligence for Computing and Networking in the Internet of Things

  • Translation of Code Mixed Language to Monolingual Languages using Rule Based Approach   Order a copy of this article
    by Shree Harsh, T.V. Prasad, G. Ramakrishna 
    Abstract: Computational linguistics is an evolving area in artificial intelligence. The demand for language translation has significantly increased due to cross-lingual communication and information exchange. Bilingual code switching is habitually observed in bilingual communities. Nowadays, much research is being done in machine translation (MT) from Indian languages to foreign languages, generally to English, and vice versa. The core components of an MT system are the identification and translation of morphological inflections and PoS word ordering with respect to language structure. Indian languages are morphologically richer than English and have multiple inflections during translation into English. This paper focuses on the analysis and translation of a code-mixed language, i.e., Hinglish, into pure Hindi and pure English. The experiments based on the algorithms in the paper are able to translate code-mixed sentences to pure Hindi with a maximum success rate of 91% and to pure English with a maximum success rate of 84%.
    Keywords: Code Mixing; Hinglish; Pure Language Translation; Hybrid Morphology.
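    One building block of a rule-based code-mixed pipeline is lexicon lookup: replace tokens found in a bilingual dictionary and pass the rest through. The toy lexicon and function below are purely illustrative (the paper's system additionally handles morphological inflections and PoS reordering, which this sketch deliberately omits):

```python
# hypothetical Hindi-to-English toy lexicon (romanized tokens)
HINDI_EN = {"mera": "my", "naam": "name", "hai": "is"}

def translate_tokens(sentence):
    """Lexicon-substitution pass over a code-mixed sentence: known Hindi
    tokens are replaced, English tokens are kept as-is. Word order is NOT
    corrected here -- that is the PoS-reordering step the paper describes."""
    return " ".join(HINDI_EN.get(w.lower(), w) for w in sentence.split())

print(translate_tokens("mera naam John hai"))  # prints: my name John is
```

    Note the output keeps Hindi SOV order ("... John is"); a reordering rule would move the copula before the complement.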

  • Enhancing the operations for Integrity check on Virtual Instance Forensic logs using Cuckoo Filter Trees   Order a copy of this article
    by Gayatri S P, Saurabh Shah, K.H. Wandra 
    Abstract: Logs play a vital role in the forensic domain. Logs are congregated by cloud service providers, or by third parties with the help of the cloud service provider. These logs can hold pieces of evidence for crimes committed using the resources of the cloud service provider. A user of the cloud who is an oppugner can hire virtual instances, launch an attack, commit a crime, delete all the contents and close the instances. In such a case, logs play a major role in tracing the oppugner. Logs stored in a centralized system have a major drawback: they can easily be tampered with by oppugners with the help of an employee of the service provider, termed a malicious insider, or with the help of the forensic investigator during the investigation process. The tampering of logs is done to defend the oppugner, who can bribe the malicious insider or the forensic investigator. To handle such issues the authors recommend techniques which aid in validating the logs against tampering. The authors have developed algorithms using cuckoo filters, which were developed in the recent past. The cuckoo filter trees assist in proving the integrity of logs to the court of law, thus establishing a legal trail and prosecuting the oppugner in a fair manner.
    Keywords: cloud forensic; cuckoo filter; oppugner; log integrity; concealment; cuckoo filter tree; forensic log; Forensics Braced Cloud; virtual instance.
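    For readers unfamiliar with the underlying data structure, the sketch below is a minimal cuckoo filter (fingerprints, two candidate buckets via partial-key cuckoo hashing, bounded eviction). It illustrates the general technique only, not the paper's tree construction; parameters are illustrative, and `nbuckets` should be a power of two for the XOR alternate-index trick to be exact:

```python
import hashlib
import random

class CuckooFilter:
    def __init__(self, nbuckets=64, bucket_size=4, max_kicks=100):
        self.nb, self.bs, self.mk = nbuckets, bucket_size, max_kicks
        self.buckets = [[] for _ in range(nbuckets)]

    def _fp(self, item):
        # 1-byte fingerprint; 0 is reserved, so map it to 1
        return hashlib.sha256(item.encode()).digest()[0] or 1

    def _h(self, data):
        return int.from_bytes(hashlib.sha256(data).digest()[:4], "big") % self.nb

    def _indices(self, item):
        fp = self._fp(item)
        i1 = self._h(item.encode())
        i2 = (i1 ^ self._h(bytes([fp]))) % self.nb  # partial-key cuckoo hashing
        return fp, i1, i2

    def insert(self, item):
        fp, i1, i2 = self._indices(item)
        for i in (i1, i2):
            if len(self.buckets[i]) < self.bs:
                self.buckets[i].append(fp)
                return True
        i = random.choice((i1, i2))
        for _ in range(self.mk):  # evict a resident fingerprint and relocate it
            j = random.randrange(len(self.buckets[i]))
            fp, self.buckets[i][j] = self.buckets[i][j], fp
            i = (i ^ self._h(bytes([fp]))) % self.nb
            if len(self.buckets[i]) < self.bs:
                self.buckets[i].append(fp)
                return True
        return False  # filter is considered full

    def contains(self, item):
        fp, i1, i2 = self._indices(item)
        return fp in self.buckets[i1] or fp in self.buckets[i2]
```

    Membership queries can return false positives (a colliding fingerprint) but never false negatives for successfully inserted items, which is what makes the structure usable as a compact integrity witness.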

  • ICU Medical Alarm System using IOT   Order a copy of this article
    by Fahd Alharbi 
    Abstract: Monitoring in Intensive Care Units (ICU) is an essential task for patient health and safety. Monitoring systems provide physicians and nurses with the ability to intervene when there is a deterioration in a patient's condition. The ICU monitoring system uses audio alarms to alert about critical conditions of the patient or when there is a medical device failure. Unfortunately, there are cases of failure to respond to medical alarms that endanger patient safety and result in death. The main reasons for the lack of response to the alarms are alarm fatigue and alarm masking. In this paper, these issues are investigated and we propose a monitoring system using the Internet of Things (IoT) to continually report the ICU medical alarms to doctors, nurses and family.
    Keywords: ICU; safety; audio alarm; alarm masking; alarm fatigue; IOT.

  • An Integrated Principal Component and Reduced Multivariate Data Analysis Technique for detecting DDoS attacks in Big data federated clouds   Order a copy of this article
    by Sengathir Janakiraman 
    Abstract: The rapid development and wide application of cloud computing in Big data applications on clouds necessitates handling massive data, since the data are distributed among diversely located data center clouds. Thus the need for an efficient detection scheme that differentiates legitimate cloud traffic from illegitimate traffic becomes indispensable. In this paper, an Integrated Principal Component and Reduced Multivariate Data Analysis (PCA-RMD) technique is proposed for detecting DDoS attacks in Big data federated clouds. The proposed PCA-RMD initially reduces the dimension of the feature characteristics extracted from the big data traffic information by minimizing the principal components based on the method of correlation. Further, the correlation method is utilized for discriminating traffic based on Enhanced and Adaptive Multivariate Correlation Analysis (EAMCA) and Enhanced Mahalanobis Distance (EMD). The proposed PCA-RMD technique is predominant in classification accuracy, memory consumption and CPU cost compared to the baseline approaches used for investigation.
    Keywords: Big data Federated Clouds; DDoS attacks; Multivariate Data Analysis; Principal Component Analysis; Enhanced Mahalanobis Distance.
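    The Mahalanobis-distance idea underlying this family of detectors can be shown in a few lines: fit a mean and covariance to normal-traffic feature vectors, then score new observations by their covariance-weighted distance. This is a generic 2-D sketch with invented numbers, not the paper's EMD variant:

```python
def mean_vec(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def cov2(rows, mu):
    """Unbiased 2x2 sample covariance."""
    n = len(rows)
    sxx = sum((r[0] - mu[0]) ** 2 for r in rows) / (n - 1)
    syy = sum((r[1] - mu[1]) ** 2 for r in rows) / (n - 1)
    sxy = sum((r[0] - mu[0]) * (r[1] - mu[1]) for r in rows) / (n - 1)
    return [[sxx, sxy], [sxy, syy]]

def inv2x2(m):
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det, m[0][0] / det]]

def mahalanobis_sq(x, mu, s_inv):
    """Squared Mahalanobis distance (x - mu)^T S^-1 (x - mu)."""
    d = [x[0] - mu[0], x[1] - mu[1]]
    return (d[0] * (s_inv[0][0] * d[0] + s_inv[0][1] * d[1])
            + d[1] * (s_inv[1][0] * d[0] + s_inv[1][1] * d[1]))

# toy traffic profile: (packet rate, payload entropy) -- illustrative values
normal = [(10, 1), (12, 1), (11, 2), (9, 2)]
mu = mean_vec(normal)
s_inv = inv2x2(cov2(normal, mu))
print(mahalanobis_sq((11, 2), mu, s_inv))   # in-profile point: small distance
print(mahalanobis_sq((20, 0), mu, s_inv))   # flood-like point: large distance
```

    A detector flags traffic whose distance exceeds a threshold calibrated on the normal profile.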

  • Smart Scheduling on Cloud for Traffic Signal to Emergency Vehicle Using IoT   Order a copy of this article
    by J. MANNAR MANNAN, Karthick Myilvahanan J, Mohemmed Yousuf R, Sindhanai Selvan K, Parameswaran T 
    Abstract: Emergency transportation in larger cities needs special attention: a single act of negligence can cause severe traffic deadlocks. Providing a dedicated lane for emergency vehicles in larger cities is not feasible, and the existing semi-automated traffic control system cannot handle emergency transportation in metropolitan cities. To address this limitation, an Internet of Things (IoT) based adaptive traffic signal control system is proposed. In the proposed system, a GPS-enabled ambulance position indicator controls the traffic signal dynamically, based on the position of the ambulance and the traffic density, by using IoT devices. RFID deployed on the roadside close to traffic signals is used to measure the length of the vehicle queue along the predetermined ambulance path. The signals on the predetermined ambulance path are turned green dynamically via the IoT-controlled traffic signal, based on the location of the ambulance received from the GPS. This dynamic scheduling of traffic signals for the smooth passage of the ambulance switches the signal timing from a fixed duration to a variable duration until the vehicle passes the signal, without introducing delay. The simulation results show that the proposed approach performs better than other existing methods and is well suited to traffic management in smart cities during emergency vehicle transportation.
    Keywords: IoT; Internet; Emergency Transport System; Smart City; RFID; GPS; Automation.

Special Issue on: Impact of Machine Learning in the Cloud Computing Revolution

  • Word Sense Disambiguation using Optimization Techniques   Order a copy of this article
    by Rajini Selvaraj, Vasuki A 
    Abstract: In the field of computational linguistics, Word Sense Disambiguation (WSD) is a problem of high significance which helps us to find the correct sense of a word or a sequence of words based on the given context. Word sense disambiguation is treated as a combinatorial optimization problem wherein the aim is to discover the set of senses which improve the semantic relatedness among the target words. Nature-inspired algorithms are helpful for finding optimal solutions in reduced time. They make use of a collection of agents that interact with the surrounding environment in a coordinated manner. In this article, two such algorithms, namely the Cuckoo Search and Firefly algorithms, have been used for solving this problem and their performance has been compared with the D-Bees algorithm, which is based on Bee Colony optimization. They have been evaluated using the standard SemEval 2016 Task 11 data set for complex word identification. Experimental results show that the Firefly algorithm performs best.
    Keywords: Word sense disambiguation; Cuckoo search; optimization; firefly; Bees algorithm; unsupervised.
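    The combinatorial objective described here (one sense per word, maximize total pairwise relatedness) can be written down directly; the metaheuristics matter because exhaustive search, shown below for clarity, explodes combinatorially. The sense labels and relatedness scores are invented for illustration:

```python
from itertools import product

def best_senses(sense_options, relatedness):
    """Exhaustively pick one sense per word to maximise total pairwise
    relatedness -- the objective that cuckoo-search/firefly agents optimise
    when the search space is too large to enumerate."""
    best, best_score = None, float("-inf")
    for combo in product(*sense_options):
        score = sum(relatedness(a, b)
                    for i, a in enumerate(combo) for b in combo[i + 1:])
        if score > best_score:
            best, best_score = combo, score
    return best

# hypothetical relatedness table between sense labels
REL = {frozenset({"bank/finance", "interest/money"}): 1.0,
       frozenset({"bank/river", "interest/curiosity"}): 0.2}
rel = lambda a, b: REL.get(frozenset({a, b}), 0.0)
print(best_senses([["bank/river", "bank/finance"],
                   ["interest/curiosity", "interest/money"]], rel))
```

    With n words of k senses each, the product space has k^n points, which is why population-based search is used instead.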

  • Multi cloud based Secure Privacy Preservation of Hospital Data in Cloud Computing   Order a copy of this article
    by KanagaSubaRaja S, Sathya Arunachalam, Karthikeyan S, Janani T 
    Abstract: The growth of cloud computing has led to abundant privacy concerns. An organization/user sends all its information to the cloud service provider, so the security of the organization's/user's data is a concern. Data privacy and security issues can be solved by establishing clear policies that enable authorized data access and security. User authentication is the primary basis for access control, so by using a cryptographic encryption mechanism such as key-policy attribute-based encryption we can provide strong authentication to ensure that data can be viewed only by those who have access to it. Following this, a robust integrity mechanism, the SHA-256 hash, is used to ensure that data is not modified in transit. These hashes are concatenated to form a top hash by structuring them in a Merkle hash tree, and are used with an erasure code to find lost data during any crash. For efficiency and to detect data loss, third-party auditors are installed to check and report any changes in any of the cloud storage. Data recovery is performed by retrieving the data from another cloud that holds a replica of the data.
    Keywords: Multi cloud; Key policy attribute based encryption; MHT; erasure code; third party auditor.
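    The "top hash" construction mentioned above is the standard Merkle-tree root: leaf hashes are concatenated pairwise and re-hashed up to a single digest, so any change in any record changes the root. A minimal sketch (odd leftover nodes are simply promoted, one of several common conventions):

```python
import hashlib

def sha256(data):
    return hashlib.sha256(data).digest()

def merkle_root(records):
    """Merkle-tree root over a list of byte records; any single-record
    tampering produces a different root."""
    level = [sha256(r) for r in records]
    while len(level) > 1:
        nxt = [sha256(level[i] + level[i + 1])
               for i in range(0, len(level) - 1, 2)]
        if len(level) % 2:          # odd node: promote it unchanged
            nxt.append(level[-1])
        level = nxt
    return level[0]
```

    An auditor only needs to store the 32-byte root; re-deriving a different root from the hosted records is evidence of tampering.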

  • Effective Data Management and Real Time Analytics in Internet of Things   Order a copy of this article
    by Jeba N, Rathi S 
    Abstract: Integrating various embedded devices and systems into our socio-economic living environment enables the Internet of Things (IoT) for smart cities. The underlying IoT infrastructure of smart and connected cities generates an enormous amount of heterogeneous data, either big in nature or fast, real-time data streams, which can be leveraged for the safe and efficient living of the inhabitants. Real-time analytics enables the extraction of useful information from voluminous data, provides information to users for decision making and helps in the feedback mechanism. In this paper, the effective management of heterogeneous data and real-time analytics on data are studied. Data management deals with collecting and storing useful information to reduce manual tasks; data management techniques should therefore be consistent and interoperable and should ensure reusability and integrity. We explain the various architectures that can be used to deploy IoT in neural networks and the various streaming techniques for real-time analytics.
    Keywords: Real time analytics; data management; heterogeneous data; IoT.

  • An Efficient Document Clustering Using Hybridized Harmony Search K-Means Algorithm with Multi view Point   Order a copy of this article
    by Siamala Devi, S. Anto , Siddique Ibrahim S P 
    Abstract: Document clustering is a much-needed process in the data mining field, where numerous documents with different methodologies are scattered. Meaningful information can be extracted from a group of documents by grouping them effectively. Various earlier studies concentrate on clustering real-world documents. In previous work, document clustering was done using the Hybridized Harmony Search K-Means (HHKM) algorithm, in which clustering is done using the K-means algorithm and the centroids of the clusters are found optimally using the harmony search algorithm. Initially, a hybridization of K-Means and Harmony Search based on concept-based, kernel and weighted-feature-based clustering (CKW HHKM) is adopted to cluster the documents. The problem residing in this method is poor accuracy while clustering the documents, where unrelated documents are grouped together. To overcome this problem, a Multi-view Point HHKM (MP HHKM) approach is introduced, in which clustering can be done accurately. In this work, multi-view-point analysis is done based on the similarity measurement. Experimental tests were conducted on the Newsgroup and TREC data sets, from which it is clear that the proposed MPHHKM technique outperforms the existing technique with better accuracy values.
    Keywords: Clustering; Harmony Search; Multi view Point; Optimal.
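    The general multi-viewpoint idea is to measure the similarity of two documents not from the origin but from reference points ("viewpoints") outside their cluster, averaging the cosine of the difference vectors. The sketch below illustrates that general notion with toy vectors; it is not claimed to be the paper's exact measure:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def multiviewpoint_sim(a, b, viewpoints):
    """Average cosine of (a - h) and (b - h) over each viewpoint h,
    i.e. similarity of a and b as seen from outside observers."""
    sims = [cosine([x - h for x, h in zip(a, v)],
                   [y - h for y, h in zip(b, v)]) for v in viewpoints]
    return sum(sims) / len(sims)
```

    Two nearby document vectors look similar from a distant viewpoint, while a plain cosine from the origin can be misled by shared dominant terms.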

  • DNA Coding and RDH Scheme Hybrid Encryption Algorithm Using SVM   Order a copy of this article
    by SHIMA RAMESH MANIYATH, Thanikaiselvan V 
    Abstract: As communication technology has advanced rapidly in recent times, the need for confidential data communication has also arisen. Here, a computationally feasible encryption/decryption algorithm is proposed to secure data using DNA sequences. The principal objective of the DNA algorithm is to reduce the encryption time for big images. In this algorithm, natural DNA sequences are used as the main keys. The image, in which secret data is hidden using the Reversible Data Hiding (RDH) technique, is encrypted twice before transmission. RDH is an information security technology which is extremely helpful in telemedicine; authentication is necessary for images captured by robots, and the technique can be used for authentication of data or of the owner of the data. It also enables us to embed Electronic Patient Record (EPR) data into a medical image before transmission, which can later be recovered on the receiving side. In the proposed scheme, the images are divided block-wise before encryption. Machine learning helps us to design a Support Vector Machine (SVM), based on which a classification scheme is obtained to group encrypted and original images separately, and to recover the original image from the encrypted image.
    Keywords: Reversible Data Hiding; DNA; Image Encryption; Support Vector Machine; Feature Extraction.
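    DNA-based schemes commonly start by mapping each 2-bit pair of the image bytes to a nucleotide. The mapping below (00→A, 01→C, 10→G, 11→T) is one of the standard choices and is shown only to illustrate the coding step, not the paper's full key-driven algorithm:

```python
DNA = {"00": "A", "01": "C", "10": "G", "11": "T"}
REV = {v: k for k, v in DNA.items()}

def dna_encode(data: bytes) -> str:
    """Map each 2-bit pair of the input bytes to a DNA base."""
    bits = "".join(f"{b:08b}" for b in data)
    return "".join(DNA[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_decode(seq: str) -> bytes:
    """Inverse mapping: 4 bases back to 1 byte."""
    bits = "".join(REV[c] for c in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

print(dna_encode(b"\x1b"))  # 0x1b = 00 01 10 11 -> prints ACGT
```

    In a full scheme, the chosen base mapping and subsequent DNA operations are keyed by the natural DNA sequences, so this fixed table is only the reversible substrate.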

  • Protection of Mental healthcare documents using sensitivity based Encryption   Order a copy of this article
    by Kalaiselvi Shanmughasundaram, Vanitha Veerasamy, Sumathi V P 
    Abstract: Data security breaches and medical identity theft are growing concerns in the current scenario. Adopting IT services provided under cloud-based technologies further increases the security threats. Several cryptographic techniques exist to protect data, where the selection of an appropriate technique increases security while reducing processing cost. The proposed method analyses textual medical documents for the sensitiveness of their content and determines the appropriate cryptographic technique to adopt. As security remains the top concern for cloud adoption, the proposed sensitivity-based encryption improves security and encryption efficiency to a significant degree. The experimentation reveals that encryption time is reduced by about 4%.
    Keywords: Encryption; Efficiency; AES; Sensitivity data; Cloud computing.

  • Using Augmented Reality to Support Children With Dyslexia   Order a copy of this article
    by Majed Aborokbah 
    Abstract: This paper presents the use of an interactive augmented reality interface to assist and support children with dyslexia, one of the most common learning disabilities in the world. Dyslexia is a literacy-based learning difficulty that mainly affects reading, writing, speaking, short-term memory and spelling. Many people, perhaps as many as 15-20% of the population as a whole, have some of the symptoms of dyslexia. This paper introduces case studies with different learning scenarios for the Arabic language, designed according to Human Computer Interaction (HCI) principles so that meaningful virtual information is presented to dyslexic children in an interactive and compelling way. Smartphones are considered potentially valuable learning tools due to their portability, accessibility and pervasiveness. The blending of technology and education is growing rapidly and becoming ever more popular; Augmented Reality (AR) is a recent example of a technology that has been brought into the educational field. This work aims to integrate mobile technology and AR methods to improve dyslexic children's (DC) academic performance, concentration and short-term memory. The design process includes the following steps: identify the research problem and determine the requirements to overcome dyslexia problems; carefully collect data from different sources; and use the collected data to construct the target product based on the prototype methodology. The outcome will contribute to improving the learning and basic skills of children with dyslexia.
    Keywords: learning disabilities; learning tools; augmented reality.

    by LALITHA THAMBIDURAI, SaravanaKumar R 
    Abstract: A sensor node is the most significant part of a wireless sensor network. Sensor nodes play various roles in a network, including data storage, data processing and routing. A cluster is an organizational element of wireless sensor networks; the demanding environment of such networks makes it essential to break them into clusters to simplify responsibilities such as communication. Cluster heads lead a cluster and have a greater data rate than the other cluster members. They are frequently needed to coordinate activities within the cluster, including but not limited to data aggregation and maintaining an account of the cluster. The base station sits at the upper level of the organized wireless sensor network and creates the communication link between the sensor network and the end user. The data in a sensor network can be used for an enormous variety of applications; a typical application makes use of network data over the internet through a personal digital assistant or desktop computer. This paper contributes a mobility-based reactive protocol named Mobility of Sink based Data Collection Protocol (MSDCP). In this protocol, sensors with high energy and the maximum quantity of information are picked as cluster heads, which gather data from the common nodes in their clusters. The data is held until the mobile sink comes within the transmission area of a cluster head and requests the gathered data; once the request is received, the cluster head forwards the data to the mobile sink.
    Keywords: WSN; MSDCP; Transmission Area; Cluster Head.
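    The cluster-head selection rule described in the abstract, picking the nodes with high energy and the most buffered data, can be sketched as a simple ranking. The field names and scoring product are illustrative assumptions, not MSDCP's exact formula:

```python
def select_cluster_heads(nodes, k):
    """Rank nodes by residual-energy x buffered-data score and pick the
    top k as cluster heads (a toy reading of the selection rule)."""
    ranked = sorted(nodes, key=lambda n: n["energy"] * n["data"], reverse=True)
    return [n["id"] for n in ranked[:k]]

nodes = [{"id": "n1", "energy": 0.9, "data": 40},
         {"id": "n2", "energy": 0.5, "data": 90},
         {"id": "n3", "energy": 0.8, "data": 70},
         {"id": "n4", "energy": 0.2, "data": 95}]
print(select_cluster_heads(nodes, 2))  # prints ['n3', 'n2']
```

    Note how n4, despite holding the most data, is excluded: its low residual energy would make it a poor relay toward the mobile sink.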

  • Fuzzy-C means Segmentation of Lymphocytes for the Identification of the Differential Counting of WBC   Order a copy of this article
    by Duraiswamy Umamaheswari 
    Abstract: In the domain of histology, discovering the population of White Blood Cells (WBC) in blood smears helps to recognize destructive diseases. Standard tests performed by human experts in hematopathological laboratories on blood samples of precarious cases such as leukemia are time-consuming, less accurate and totally dependent upon the expertise of the technicians. In order to gain the advantages of faster analysis and perfect partitioning at clumps, an algorithm is proposed in this paper that automatically counts the lymphocytes present in peripheral blood smear images containing Acute Lymphoblastic Leukemia (ALL). The algorithm performs lymphocyte segmentation by Fuzzy C-Means (FCM) clustering. Afterward, neighboring and touching cells in cell clumps are individuated by the Watershed Transform (WT), and morphological operators are then applied to bring the cells into an appropriate format for feature extraction. The extracted features are thresholded to eliminate regions other than lymphocytes. The algorithm achieves 98.52% accuracy in counting lymphocytes over 80 blood smear image samples of the ALL-IDB1 dataset.
    Keywords: Fuzzy c-means; medical image processing; morphology; segmentation; watershed; WBC count; leukemia.
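    The FCM step alternates two closed-form updates: fuzzy memberships from distances to the centres, then centres from membership-weighted means. A 1-D sketch on toy intensities (real segmentation runs on pixel feature vectors, and initialization here is a simple spread over the intensity range):

```python
def fcm_1d(xs, c=2, m=2.0, iters=60):
    """Fuzzy C-Means on 1-D intensities; returns sorted cluster centres."""
    lo, hi = min(xs), max(xs)
    centers = [lo + (hi - lo) * i / (c - 1) for i in range(c)]
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for x in xs:
            d = [abs(x - v) or 1e-12 for v in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c)) for i in range(c)])
        # centre update: v_i = sum_k u_ik^m x_k / sum_k u_ik^m
        centers = [sum(u[k][i] ** m * xs[k] for k in range(len(xs)))
                   / sum(u[k][i] ** m for k in range(len(xs)))
                   for i in range(c)]
    return sorted(centers)
```

    Unlike hard K-means, every pixel keeps a graded membership in all clusters, which is what helps at the fuzzy boundaries of touching cells before the watershed step.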

  • A New Venture to Image Encryption using Combined Chaotic System and Integer Wavelet Transforms   Order a copy of this article
    by Subashanthini S, Pounambal Muthukumar 
    Abstract: In this digital era, securing multimedia information is receiving its due concern apart from securing textual data. Securing images by utilising integer wavelet transforms is the chief interest of the proposed work. This research work is envisioned to explore the use of reversible Integer Wavelet Transforms (IWT) for designing robust image encryption algorithms. The proposed exploration helps to close the gap between image encryption and the existing robust IWTs. Ten different IWTs, namely Haar, 5/3, 2/6, 9/7-M, 2/10, 5/11-C, 5/11-A, 6/14, 13/7-T and 13/7-C, are used for the analysis. The four keys utilised for image scrambling and image diffusion are generated with the help of the proposed combined chaotic system. Image scrambling is performed only on the approximation coefficients to obtain full image scrambling, and bit XOR is used for image diffusion. The proposed method yields an NPCR value of 99.6246%, a UACI value of 33.5829, an entropy value of 7.997 and very low correlation values. Simulation results prove that image encryption techniques can be designed with various integer wavelet transforms.
    Keywords: IWT; Chaotic map; Image encryption; Bit XOR encryption; Image scrambling; Entropy.
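    What makes these transforms "reversible" is the lifting scheme: integer inputs map to integer coefficients and back with no rounding loss. The simplest member of the family listed above, the Haar IWT, sketched for one decomposition level:

```python
def haar_iwt(x):
    """One level of the integer Haar lifting on an even-length sequence:
    detail d = x1 - x0, approximation s = x0 + floor(d / 2)."""
    s, d = [], []
    for i in range(0, len(x), 2):
        diff = x[i + 1] - x[i]
        avg = x[i] + (diff >> 1)   # floor average; exactly invertible on ints
        d.append(diff)
        s.append(avg)
    return s, d

def haar_iiwt(s, d):
    """Exact inverse of haar_iwt."""
    x = []
    for avg, diff in zip(s, d):
        a = avg - (diff >> 1)
        x += [a, a + diff]
    return x
```

    Scrambling only the approximation band `s`, as the abstract describes, still disturbs the whole reconstructed image because every output pixel depends on an approximation coefficient.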

  • Programming and Epic Based Digital Storytelling Using Scratch   Order a copy of this article
    by Yamunathangam D 
    Abstract: Storytelling is a powerful tool for imparting traditional and cultural values to children. The traditional storytelling practised by our ancestors has declined, and digital storytelling has emerged as its successor; the modern storytelling method follows strategies similar to classical storytelling. Digital storytelling has begun its evolution in the teaching and learning process and has emerged as an excellent tool for engaging teachers and their students. Middle school students use various digital storytelling environments to learn a programming language. In this paper, an Epic Based Digital Storytelling (EBDS) pedagogy using Scratch to learn a programming language is discussed. The various aspects of using EBDS in education are given in the paper.
    Keywords: Epic Based Digital Storytelling; pedagogy; Scratch; team based learning; Programming.

Special Issue on: IRICT 2019 Innovations in Cloud Computing Technologies

  • Ontology Building for Patient Bioinformatics of the Smart Card Domain: Implementation Using Owl   Order a copy of this article
    by Waseem Alromima, Ahmed Alahmadi 
    Abstract: Smart cards play a very important part in facilitating the bioinformatics information process. The current problem is integrating information, such as structuring similar information for analysis and services. Therefore, patient information services need to be modelled and re-engineered in the area of e-governmental information sharing and processing to deliver information appropriately according to the citizen and situation. Semantic web technology-based ontology has brought a promising solution to these engineering problems. The main purpose of this study is to provide each patient with a smart card that will hold all their bioinformatics for their entire life, based on the proposed domain ontology; it will be recognized and used in all organizations related to e-health. Another aim is the automatic processing of important medical documents by the related organizations, such as pharmacies, hospitals and clinics. The smart card can draw up the history of the patient in terms of illnesses and/or treatments, thus facilitating the future management of his/her medical file. The proposed ontology for e-health information makes it easy to introduce new bioinformatics information for patient services without disturbing the structure of the former ontology. The ontology was created with the knowledge-based editor tool Protégé.
    Keywords: Semantic Web; Domain ontology; Services; owl; Citizens; e-health; Patient.

  • Machine Learning Classifiers with Preprocessing Techniques for Rumor Detection on Social Media: An Empirical Study   Order a copy of this article
    by Mohammed Al-Sarem, Muna Al-Harby, Essa Abdullah Hezzam 
    Abstract: The rapid increase in the popularity of social media has helped users to easily post and share information with others. However, due to the uncontrolled nature of social media platforms such as Twitter and Facebook, it has become easy to post fake news and misleading information. The task of detecting such content is known as rumor detection. This task requires data analytics tools due to the massive amount of shared content and the rapid speed at which it is generated. In this work, the authors study the impact of different text preprocessing techniques on the performance of classifiers when performing rumor detection. The experiments were performed on a dataset of tweets on emerging breaking news stories covering several events in the Saudi political context (EBNS-SPC). The results show that preprocessing techniques have a significant impact on increasing the performance of machine learning methods such as the support vector machine (SVM), Multinomial Naïve Bayes and K-nearest neighbour.
    Keywords: Rumor Detection; Saudi Arabian News; Multinomial Naïve Bayes; Support Vector Machine; K-nearest Neighbor; Twitter Analysis.
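    A typical tweet-preprocessing pipeline of the kind compared in such studies chains lowercasing, URL/mention/hashtag stripping, punctuation removal, tokenization and stop-word filtering. A sketch with an illustrative (deliberately tiny) stop-word list; the paper's exact technique set may differ:

```python
import re
import string

STOPWORDS = {"the", "a", "an", "is", "in", "of", "and", "to"}  # toy subset

def preprocess(tweet, remove_stopwords=True):
    """Normalize one tweet into content tokens."""
    text = tweet.lower()
    text = re.sub(r"https?://\S+", "", text)   # strip URLs
    text = re.sub(r"[@#]\w+", "", text)        # strip mentions and hashtags
    text = text.translate(str.maketrans("", "", string.punctuation))
    tokens = text.split()
    if remove_stopwords:
        tokens = [t for t in tokens if t not in STOPWORDS]
    return tokens

print(preprocess("Breaking: The story is FAKE! http://t.co/x #rumor"))
# prints ['breaking', 'story', 'fake']
```

    Studies of this kind toggle each step on and off and re-train the classifiers, which is how the per-technique impact reported in the abstract is measured.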

Special Issue on: ICAIIS-2019 Smart Intelligent Computing and Communication Systems

  • Tangles in IOTA to make Crypto currency Transactions Free and Secure   Order a copy of this article
    by Prabakaran Natarajan 
    Abstract: The introduction of the blockchain has made a revolutionary change in cryptocurrency around the world, but it has not delivered on its promises of free and faster transaction confirmation. Serguei Popov proposed the tangle, a directed acyclic graph which is essentially considered the successor of the blockchain and offers required features such as machine-to-machine micropayments and feeless transactions. It requires a user to approve two previous transactions in the web in order to participate in the network. This essentially eliminates the miners and the mining part from the currency exchange and allows participants to carry out their transactions without fees. Since each participant verifies two previous transactions, this also contributes to the security of the tangle. In this paper, the features of IOTA and the improvements made to it using tangles are discussed, along with how the tangle contributes to security and enables participants to make feeless transactions.
    Keywords: E-coin; Block-chain; Cryptocurrency; IOTA; Tangles; DLT; Feeless Transaction.
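    The attach-by-approving rule can be sketched as a small DAG: each new transaction selects up to two current tips (unapproved transactions) and approves them. This is a toy uniform tip selection without proof-of-work, not IOTA's weighted random walk:

```python
import random

def new_tangle():
    return {"genesis": {"approves": [], "approved_by": []}}

def tips(tangle):
    """Transactions not yet approved by anyone."""
    return [t for t, v in tangle.items() if not v["approved_by"]]

def add_transaction(tangle, tx_id):
    """Attach tx_id by approving up to two tips -- the rule that replaces
    miners in a tangle (tip selection here is uniform, for illustration)."""
    chosen = random.sample(tips(tangle), k=min(2, len(tips(tangle))))
    tangle[tx_id] = {"approves": chosen, "approved_by": []}
    for t in chosen:
        tangle[t]["approved_by"].append(tx_id)
```

    Because every participant does this small amount of validation work when issuing a transaction, there is no separate miner to pay, which is where the feeless property comes from.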

  • A Novel Filter for Removing Image Noise and Improving the Quality of Image   Order a copy of this article
    by Prathik A, Anuradha J, Uma K 
    Abstract: This paper proposes a Hybrid Wavelet Double Window Median Filter (HWDWM), made by blending the Decision Based Coupled Window Median Filter with the Discrete Wavelet Transform (DWT), and reviews the filters widely used for noise removal. The proposed filter uses two windows, a row window and a column window. The method takes the noisy image and moves the row window, indexing from the first pixel of the noisy image up to the last; indexing is then done by the column window, and the image signal is decomposed to provide localization. The noisy image is decomposed by the DWT, and the coefficients are transformed into independently distributed variables. The coefficients are then analyzed on the basis of a threshold, and the image is reconstructed using the inverse wavelet transform after thresholding. Experiments were executed to show the effect of noise removal filters on soil images. Two metrics are used to measure image quality: peak signal-to-noise ratio (PSNR) and root mean square error (RMSE). Experimental results show the superiority of this filter over other noise removal filters.
    Keywords: Data mining; Soil Classifications; Filters; PSNR and MSE.
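    The two quality metrics named in the abstract have standard definitions: RMSE is the root of the mean squared pixel error, and PSNR relates the peak pixel value (255 for 8-bit images) to that error on a log scale. A sketch over flat pixel lists:

```python
import math

def rmse(orig, denoised):
    """Root mean square error between two equal-length pixel sequences."""
    n = len(orig)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(orig, denoised)) / n)

def psnr(orig, denoised, peak=255):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    e = rmse(orig, denoised)
    return float("inf") if e == 0 else 20 * math.log10(peak / e)
```

    Higher PSNR and lower RMSE both indicate a reconstruction closer to the clean image, which is how the filters in the paper are ranked.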

  • Implementation of Data Mining to Enhance the Performance of Cloud Computing Environment   Order a copy of this article
    by Annaluri Sreenivasa Rao, Attili Venkata Ramana, Somula Ramasubbareddy 
    Abstract: To deal with large-scale computing events, the advantages of cloud computing are used extensively, whereby machines can process larger data in a scalable manner. Most government agencies across the globe use the cloud computing platform architecture and its applications to obtain the desired services and business goals. However, one cannot ignore the challenges involved in using a technology linked with large amounts of data and internet applications (i.e. the cloud). The many promising advantages of cloud computing, involving distributed and grid computing, virtualization, etc., help the scientific community, but the technology is also restricted by its limitations. One of the biggest challenges cloud computing faces is the exploitation of its opportunities for security breaches and related issues. In this paper, an extensive mitigation system is proposed to achieve enhanced security and a safer environment when using cloud computing applications. The decision tree model's CHAID algorithm is shown to be a robust technique for classification and decision making, providing high-end security for cloud services. This work shows that standards, controls and policies are very important to the management processes for securing and protecting the data involved at the time of processing or application usage. A good management process also needs to assess and examine the risks involved in cloud computing, protecting the system in use and the data involved from various security issues and exploits.
    Keywords: Cloud computing; security; Data mining; Multilayer perceptron; decision tree (C4.5); Partial Tree.
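The classification step can be pictured with a small decision tree. Note that scikit-learn implements CART rather than the CHAID algorithm used in the paper, and the request features and labels below are invented purely for illustration:

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features per cloud request:
# [failed_logins, bytes_out_mb, off_hours]; labels: 0 = benign, 1 = suspicious.
X = [
    [0, 1, 0], [1, 2, 0], [0, 3, 1], [2, 1, 0],        # benign traffic
    [9, 40, 1], [7, 55, 1], [8, 60, 0], [10, 80, 1],   # suspicious traffic
]
y = [0, 0, 0, 0, 1, 1, 1, 1]

# A shallow tree is enough here; real deployments would tune depth/pruning.
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(clf.predict([[0, 2, 0], [9, 70, 1]]))  # expect: benign, suspicious
```

The tree's learned split thresholds play the role of the "controls and policies" the abstract argues for: explicit, auditable decision rules over request attributes.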

  • Analysis of Breast Cancer Prediction and Visualization using Machine Learning Models   Order a copy of this article
    by Magesh G, Swarnalatha P 
    Abstract: Breast cancer is one of the most commonly occurring malignancies in women, with millions of new cases diagnosed among women and over 400,000 deaths annually worldwide. Our dataset has 30 real-valued attributes as features, computed from the fine needle aspirate (FNA) test: the input values are extracted from the digitised image of an FNA of a breast mass. Many algorithms are used in prediction systems; we choose the best algorithms based on precision, accuracy and error rate, and compare effective ways of applying the algorithms and classifying the data. A performance comparison is conducted between different machine learning algorithms on the breast cancer dataset, and data visualisation and descriptive statistics are presented. SVM with all features achieves 95% precision, recall and F1-score; after tuning the SVM parameters, accuracy improves to 97%.
    Keywords: Breast Cancer; Machine Learning; Decision Tree; Classification; SVM; Prediction.
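scikit-learn ships the Wisconsin diagnostic breast cancer dataset with the same 30 FNA-derived real-valued features described above, so the SVM baseline can be sketched directly. This is a minimal sketch, not the authors' exact pipeline; C and gamma are the parameters one would tune:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)  # 569 samples x 30 features
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y)

# Feature scaling matters for SVMs; C and gamma are the knobs to tune.
model = make_pipeline(StandardScaler(), SVC(C=1.0, kernel="rbf", gamma="scale"))
model.fit(X_tr, y_tr)
print(f"test accuracy: {model.score(X_te, y_te):.3f}")
```

On this dataset a scaled RBF SVM typically lands in the mid-to-high 90s, consistent with the figures the abstract reports.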

  • A comparative study on various preprocessing techniques and deep learning algorithms for text classification   Order a copy of this article
    by Bhuvaneshwari Petchimuthu, NagarajaRao A 
    Abstract: Preprocessing is the primary technique employed in sentiment analysis, and selecting suitable methods can increase classifier accuracy. It reduces the complexity inherent in raw data, which enables the classifier to learn faster and more precisely. Despite its importance, preprocessing for polarity detection has not received much attention in the deep learning literature. In this paper, 13 popular preprocessing techniques are evaluated on online user review datasets from three different domains. To evaluate the impact of each preprocessing technique, four deep neural networks are used: an auto-encoder, a Convolutional Neural Network (CNN), Long Short-Term Memory (LSTM) and Bidirectional LSTM (BiLSTM). The experimental results of this study show that appropriate preprocessing techniques can improve classification success. In addition, it is noted that the BiLSTM model performs better than the remaining neural networks.
    Keywords: sentiment analysis; deep learning; auto-encoder; convolutional neural network; long short-term memory; bidirectional LSTM.
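A few of the common preprocessing steps evaluated in studies like this (lower-casing, number and punctuation removal, stop-word filtering) can be sketched with the standard library alone. The stop-word list here is a toy subset, and the flags stand in for turning individual techniques on and off during evaluation:

```python
import re
import string

STOPWORDS = {"the", "a", "an", "is", "was", "and", "i", "it", "this"}

def preprocess(review, lowercase=True, strip_punct=True,
               strip_numbers=True, drop_stopwords=True):
    """Apply a configurable subset of common preprocessing steps."""
    text = review
    if lowercase:
        text = text.lower()
    if strip_numbers:
        text = re.sub(r"\d+", " ", text)       # digits rarely carry polarity
    if strip_punct:
        text = text.translate(str.maketrans("", "", string.punctuation))
    tokens = text.split()
    if drop_stopwords:
        tokens = [t for t in tokens if t not in STOPWORDS]
    return tokens

print(preprocess("This film was GREAT!!! 10/10, loved it."))
# ['film', 'great', 'loved']
```

Running the classifier once per on/off combination of such flags is how the per-technique impact is measured.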

  • An Optimal Selection of Virtual Machine for E-Healthcare Services in Cloud Data Centers   Order a copy of this article
    Abstract: In recent times, cloud computing has played a huge role in the delivery of healthcare services: electronic healthcare services are used to improve healthcare performance in the cloud. Selecting and placing virtual machines for healthcare services is an important task and one of the open challenges in the cloud, since large data centers are used to process medical requests. Good placement maximises resource utilisation and reduces the execution time of medical requests in the cloud data center. Many techniques have been used to solve such optimisation problems over cloud resources. In this paper, a hybrid request-factor-based multi-objective grey wolf optimisation (RMOGWO) algorithm is proposed to handle healthcare requests in cloud data centers efficiently. The proposed algorithm was tested and compared with well-known benchmark algorithms for VM utilisation in cloud data centers. In addition, the efficiency of the electronic healthcare service system increases cloud utilisation, and the hybrid algorithm supports a high level of interaction with users. It is a superior model that improves resource utilisation for healthcare services in the cloud.
    Keywords: Cloud Computing; Healthcare services; Virtualization; Multi-Objective Grey Wolf Optimization.
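The paper's RMOGWO hybrid is not reproduced here, but the underlying grey wolf optimiser can be sketched for a single objective. The sphere function below merely stands in for a real VM-placement cost; pack size, iteration count and bounds are illustrative choices:

```python
import random

def gwo(cost, dim, bounds, wolves=20, iters=200, seed=1):
    """Minimal single-objective grey wolf optimiser (alpha/beta/delta leaders)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pack = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(wolves)]
    best = min(pack, key=cost)[:]
    for t in range(iters):
        pack.sort(key=cost)
        # Snapshot the three best wolves before moving the pack.
        alpha, beta, delta = pack[0][:], pack[1][:], pack[2][:]
        a = 2.0 * (1 - t / iters)          # exploration factor shrinks to 0
        for i in range(wolves):
            new = []
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = rng.random(), rng.random()
                    A, C = 2 * a * r1 - a, 2 * r2
                    x += leader[d] - A * abs(C * leader[d] - pack[i][d])
                new.append(min(hi, max(lo, x / 3.0)))  # average, clamped
            pack[i] = new
        cand = min(pack, key=cost)
        if cost(cand) < cost(best):
            best = cand[:]
    return best

# Toy cost: the sphere function; a real cost would score a VM placement.
best = gwo(lambda p: sum(v * v for v in p), dim=3, bounds=(-5.0, 5.0))
print(best)  # should be close to the optimum [0, 0, 0]
```

A multi-objective variant such as the paper's would keep an archive of non-dominated solutions instead of a single best vector.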

  • A Study on Automated Toll Collection: Towards the Utilization of RFID based System   Order a copy of this article
    by Naresh Kannan, Ranjan Goyal, Dhruv Goel 
    Abstract: Toll collection is becoming a major problem on highways, leading to long waiting queues. Toll gates installed on highways result in increased waiting time and fuel usage. In this paper, a study of an automated toll collection system based on Radio Frequency Identification (RFID) is presented, which provides fast identification of vehicles and fast toll collection. With this system, identification can be done simply by slowing the vehicle as it passes the toll plaza: the RFID reader scans the RFID tag or card and deducts the toll amount from it. The study analyses the system by proposing a mechanism and implementing an example scenario that considers the random arrival of different vehicles. The technique is also compared with other existing mechanisms, such as number plate recognition and bar-code-based passes, which shows the case for adopting RFID for toll collection.
    Keywords: Automated toll collection; Radio Frequency Identification (RFID); RFID reader; RFID tag; Micro-controller.
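The deduct-on-scan mechanism with random vehicle arrivals can be sketched as follows. Tag IDs, balances and toll fees are invented for illustration:

```python
import random

# Hypothetical prepaid balances keyed by RFID tag ID.
accounts = {"TAG-001": 500.0, "TAG-002": 120.0, "TAG-003": 35.0}
TOLL = {"car": 50.0, "truck": 100.0}

def pass_toll(tag_id, vehicle_type):
    """Deduct the toll if the tag is known and funded; otherwise reject."""
    fee = TOLL[vehicle_type]
    balance = accounts.get(tag_id)
    if balance is None or balance < fee:
        return False  # barrier stays down: unknown tag or low balance
    accounts[tag_id] = balance - fee
    return True

# Random arrival of different vehicles, as in the example scenario.
rng = random.Random(7)
arrivals = [(rng.choice(list(accounts)), rng.choice(list(TOLL)))
            for _ in range(5)]
for tag, vtype in arrivals:
    print(tag, vtype, "passed" if pass_toll(tag, vtype) else "rejected")
```

In a real deployment the account lookup would hit a back-end server; the point here is only that the per-vehicle decision is a constant-time lookup, unlike plate recognition or manual bar-code scanning.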

  • A Microcontroller based System for Patient and Elderly Community Assistance   Order a copy of this article
    by Asmita Chotani, Naresh Kannan 
    Abstract: The elderly are often bedridden owing to age and health issues, so there is a need for a system that can aid this group. In this paper, a microcontroller-based system model is proposed. The system assists patients and the elderly by providing them a facility to signal their needs to attenders/wards through a handheld device. Depending on the frequency generated by a key press, a particular need from a predefined set is announced to the facilitator as audio from a device mounted within the facilitator's proximity.
    Keywords: DTMF; Mobile Phone; microcontroller; Arduino; patient; assistance.
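The key-press-to-need mapping can be illustrated with the standard DTMF frequency table (each keypad key emits one row tone and one column tone, which the decoder reports to the microcontroller). The need table itself is hypothetical:

```python
# Standard DTMF frequency pairs (row Hz, column Hz) for a phone keypad.
ROW = {0: 697, 1: 770, 2: 852, 3: 941}
COL = {0: 1209, 1: 1336, 2: 1477}
KEYS = "123456789*0#"  # keypad layout, read row by row

def dtmf_pair(key):
    """Return the (low, high) tone pair a keypad key generates."""
    idx = KEYS.index(key)
    return ROW[idx // 3], COL[idx % 3]

# Hypothetical mapping from decoded key press to a predefined need.
NEEDS = {"1": "water", "2": "food", "3": "medicine", "4": "restroom help"}

def announce(key):
    low, high = dtmf_pair(key)  # what the decoder chip reports to the MCU
    need = NEEDS.get(key, "general assistance")
    return f"tones {low}/{high} Hz -> announce: {need}"

print(announce("2"))  # tones 697/1336 Hz -> announce: food
```

On the actual hardware a DTMF decoder IC performs the tone-pair detection and the Arduino merely maps the decoded digit to the stored audio prompt.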

  • The Big Data in Healthcare Industry Made Simple to Save People Life   Order a copy of this article
    by Vijay Anand R, Iyapparaja M 
    Abstract: In healthcare systems, big data plays an important role: data analysis is used to predict disease outcomes, prevent the effects of additional disorders or diseases, reduce mortality and save the cost of medical treatment. In many countries, big data plays a main role in disease diagnosis and treatment by generating the information needed to identify diseases. Large-scale initiatives have begun, and several projects have been put in place to share patients' scientific records and identify their histories across the general public and private hospitals and clinics. However, there are many challenges in applying big data in healthcare, especially in relation to privacy, security, standards, governance, data integration, data storage, data classification and integration of the technology. It is imperative that these challenges be overcome before big data can be implemented effectively in healthcare.
    Keywords: Big data; Healthcare; Bayesian network; Patients.

Special Issue on: IAIM2019 Advances in Data Science and Computing

  • Complex Event Processing for Effective Poultry Management in Farmlands with Predictive Analytics   Order a copy of this article
    by Imthyaz Sheriff, E. Syed Mohammed, Joshua J, Hema C 
    Abstract: The world is a bundle of interconnected events: the occurrence of one event may influence one or more related events. The study of such real-time events and their interdependence is called complex event management. Its challenge is the ability to capture real-time events, analyse them and take decisions so that the system works in the most desirable, optimal way. The focus of this study is on the real-time happenings in a livestock management environment, applying predictive analytics after analysing the parameters that affect the complex event of effective livestock management. Predictive analytics is a form of data analytics that analyses both historical data and current live stream data to forecast activities, behaviours and trends. Livestock management is a significant area for deploying predictive analytics, as behaviour patterns vary widely and depend on the complex events happening around the animals. Within livestock management, our main focus is on poultry. In this research work we have designed a system that continuously monitors the events happening in a poultry farm. Data is collected through sensors that detect moisture content, light, time and weather conditions, and individual birds are RFID-tagged to help capture movement stream data. A cloud-based event and data management system has been developed, and analysis is carried out on historical data as well as live stream data. The proposed model employs the K-means clustering algorithm to cluster the behaviour patterns of the poultry birds, and machine learning algorithms are used to capture the varied complex events that influence the well-being of the farm birds. The proposed system has been tested on a real farm with 846 country chickens, and our prediction algorithms have achieved an accuracy of about 78%. The parameters that most strongly affect the behaviour of the livestock have been identified. Our system has been able to predict unusual behaviour patterns in the livestock as well as foresee disease outbreaks among the chickens in the farm house. Our future work focuses on the design and development of a complex event processing framework to cater to the effective management of livestock on a farm as a whole.
    Keywords: Predictive Analytics; Complex Event Processing; Data Mining; Data Analytics.
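The clustering step can be sketched with a plain K-means implementation. The per-bird features below are invented stand-ins for the RFID movement and sensor stream data:

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain K-means: assign each point to its nearest centroid, then update."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Distance of every point to every centroid, then nearest assignment.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels, centroids

# Invented per-bird features: [movements/hour from RFID, feeder visits/day].
birds = np.array([[30, 8], [28, 9], [32, 7],   # active birds
                  [5, 2], [4, 1], [6, 3]],     # lethargic birds
                 dtype=float)
labels, centroids = kmeans(birds, k=2)
print(labels)  # two behaviour clusters: active vs lethargic
```

Birds whose live-stream features drift away from their usual cluster centroid are the "unusual behaviour patterns" that trigger an alert in a system like the one described.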