Forthcoming and Online First Articles

International Journal of Advanced Intelligence Paradigms

International Journal of Advanced Intelligence Paradigms (IJAIP)

Forthcoming articles have been peer-reviewed and accepted for publication but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

Online First articles are published online here, before they appear in a journal issue. Online First articles are fully citeable, complete with a DOI. They can be cited, read, and downloaded. Online First articles are published as Open Access (OA) articles to make the latest research available as early as possible.

Articles marked with this Open Access icon are Online First articles. They are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licenses.

Register for our alerting service, which notifies you by email when new issues are published online.

We also offer feeds which provide timely updates of tables of contents, newly published articles and calls for papers.

International Journal of Advanced Intelligence Paradigms (276 papers in press)

Regular Issues

  • Performance improvement in Cardiology department of a hospital by Simulation   Order a copy of this article
    by Shriram K. Vasudevan, Narassima Seshadri, Anbuudayasankar SP, Thennarasu M 
    Abstract: The healthcare industry plays a vital role in the life of humankind and in the economic development of a country. Healthcare services have to be provided as and when required, without delay and without compromise on quality. This research focuses on reducing the waiting time of patients, as it is one of the important parameters governing service quality and is considered to improve patient satisfaction. This was achieved through a case study in the cardiology outpatient department of a private hospital in South India. Cardiology was chosen as it is one of the most critical areas demanding immediate attention. The study follows a Discrete Event Simulation approach for analysing the trajectory of patients in the cardiology department, determining various performance parameters, suggesting changes to the existing system and developing alternative models whose results are compared with those of the existing model. Reducing waiting time permits physicians to attend to more patients in a given period, as is evident from the results obtained from the developed models. Simulation results revealed that the four alternative systems proposed were more effective than the existing system.
    Keywords: Discrete Event Simulation; Arena model; Healthcare; Cardiology; Outpatient department; Waiting time reduction.
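The queueing dynamics this abstract describes can be illustrated with a minimal discrete-event sketch (this is not the authors' Arena model; the arrival times, service times and physician counts below are invented for illustration):

```python
import heapq

def simulate_clinic(arrivals, service_times, physicians=1):
    """Discrete-event sketch: patients queue FIFO for a pool of physicians.

    arrivals and service_times are parallel lists (minutes).
    Returns the waiting time (queue delay) of each patient.
    """
    # Each heap entry is the time at which a physician next becomes free.
    free_at = [0.0] * physicians
    heapq.heapify(free_at)
    waits = []
    for arrive, service in zip(arrivals, service_times):
        earliest = heapq.heappop(free_at)   # next physician to free up
        start = max(arrive, earliest)       # wait if all physicians are busy
        waits.append(start - arrive)
        heapq.heappush(free_at, start + service)
    return waits

# Adding a second physician is the kind of alternative model the study compares.
waits_1 = simulate_clinic([0, 5, 10, 15], [20, 20, 20, 20], physicians=1)
waits_2 = simulate_clinic([0, 5, 10, 15], [20, 20, 20, 20], physicians=2)
```

Comparing `waits_1` against `waits_2` mirrors the study's approach of evaluating alternative resource configurations against the existing system.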

  • Thumb Movement for Prosthetic Hand Based on Fuzzy Logic   Order a copy of this article
    by Anilesh Dey, Amarjyoti Goswami, Abdur Rohman, Jamini Das, Nilanjan Dey, Amira S. Ashour, Fuqian Shi 
    Abstract: Electromyography innovation leads to the development of modern prosthesis (artificial limb) control. Prosthetic hands are developed to assist amputees during their daily activities. Over the years, it has been seen that the fluid movements required to carry out different functions, such as gripping and holding, are not reaching their full potential, especially in the thumb movement pattern. Consequently, the current work proposes an efficient mechanism for the movement of the prosthetic thumb in order to position the thumb even at intermediate angles such as 45.3 degrees and 78.6 degrees. Obtaining such flexibility will lead to a movement pattern that is more similar to that of the human hand. A fuzzy-based control strategy is applied to design a prosthetic thumb with the above-mentioned movement pattern. A Mamdani fuzzy control model is proposed with three input variables, namely the thumb's first joint bend, second joint bend and the second joint's movement in the left and right directions. The proposed system provided the expected results, where twenty-seven combinations of the rules facilitate the alignment of the prosthetic thumb at different degrees.
    Keywords: Intermediate movements; Mamdani fuzzy control; Prosthetic thumb movement.
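Mamdani-style inference of the kind the abstract describes can be sketched as follows. This is a reduced two-rule, two-input toy, not the paper's 27-rule, three-input controller; all membership functions, ranges and rule choices here are invented:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function over points a <= b <= c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def mamdani_thumb_angle(bend1, bend2):
    """Toy Mamdani controller: two joint-bend inputs (0..1) -> thumb angle.

    Output universe 0..90 degrees; two illustrative rules:
      IF bend1 is LOW  AND bend2 is LOW  THEN angle is SMALL
      IF bend1 is HIGH AND bend2 is HIGH THEN angle is LARGE
    """
    angle = np.linspace(0.0, 90.0, 901)        # discretised output universe
    low  = lambda v: tri(v, -0.5, 0.0, 1.0)    # input memberships
    high = lambda v: tri(v, 0.0, 1.0, 1.5)
    # Rule firing strengths: fuzzy AND as min.
    w_small = min(low(bend1), low(bend2))
    w_large = min(high(bend1), high(bend2))
    # Clip (min) each consequent set, aggregate with max.
    agg = np.maximum(np.minimum(w_small, tri(angle, 0, 0, 45)),
                     np.minimum(w_large, tri(angle, 45, 90, 90)))
    # Centroid defuzzification.
    return float((angle * agg).sum() / (agg.sum() + 1e-12))
```

With both bends at zero only the SMALL rule fires and the defuzzified angle sits near the centroid of the SMALL set; the paper's 27 rules extend the same mechanics to three inputs with three terms each.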

    by Thiyagarajan Venkatesh 
    Abstract: Global Positioning System (GPS) satellites produce low power signals that travel great distances to reach the receiver. To negate a GPS system, an adversary needs only to generate a jamming signal with enough power and a suitable temporal or spectral signature to deny the use of GPS throughout a given area. The first system developed to increase the GPS anti-jam capability for users on the ground or in the air was the controlled reception pattern antenna. This device consists of an array of antenna elements, all connected to an electronics box that controls either the phase or the gain or both and combines them to give a single output. From both military and civilian perspectives it is important to establish an adequate anti-jamming capability for GPS systems and ensure the availability of this asset in all environments. This was recognised by the military and resulted in the development of several mitigation techniques in the time domain, the time-frequency domain, Adaptive Antenna Arrays (AAA) and PC based software defined radio concepts. In this study, a circular geometry of five patch antennas operating at L2 = 1.227 GHz is designed and fabricated. A phase only nulling technique based on hybrid optimization is proposed and evaluated using IE3D software.
    Keywords: Global Positioning System (GPS); anti-jam; Adaptive Antenna Arrays (AAA); Circular geometry; patch antennas; Phase only nulling; Artificial Bee Colony (ABC) algorithm; Cuckoo Search (CS).

  • Using modified background subtraction for detecting vehicles in Videos   Order a copy of this article
    by Mohamed Maher Ata, Mohamed El-Darieby, Mustafa Abd El-nabi, Sameh A. Napoleon 
    Abstract: In this paper, a comparative study is presented between the traditional foreground detector based on background subtraction and a modified detector based on empty frame subtraction. Our case study estimates average vehicular speed and the level of crowdedness in three test traffic videos with five different indices (frame rate, resolution, number of frames, duration and extension). The proposed modification of the background subtraction strategy aims to reduce vehicle detection processing time, which increases vehicle tracking efficacy. In addition, we applied several video degradations (salt-and-pepper noise, Gaussian noise and speckle noise) to the traffic videos in order to evaluate the effect of challenging weather conditions on the detection processing time. The degradation was applied with both the traditional and the modified background subtraction for detecting vehicles in traffic videos. Results show an obvious enhancement in the processing time of the detected vehicles with the proposed modification compared with the traditional background detector.
    Keywords: computer vision; foreground object detection; background subtraction; video degradation.
    DOI: 10.1504/IJAIP.2018.10018819
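The core of the modified detector, subtracting a single pre-captured empty frame instead of maintaining an adaptive background model, can be sketched as follows (grayscale arrays and the threshold value are illustrative assumptions, not the paper's parameters):

```python
import numpy as np

def foreground_mask(frame, background, thresh=25):
    """Empty-frame subtraction: pixels differing from a pre-captured
    empty-road frame by more than `thresh` are marked as vehicle pixels."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > thresh

# Toy 8x8 grayscale "road": background is flat grey, a vehicle brightens a strip.
background = np.full((8, 8), 100, dtype=np.uint8)
frame = background.copy()
frame[4, 3:6] = 200                       # vehicle pixels in row 4
mask = foreground_mask(frame, background)
```

Because the background is fixed, the per-frame cost is one subtraction and one threshold, which is the source of the processing-time saving the abstract reports.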
  • Content based load balancing of tasks using task clustering for cost optimization in cloud computing environment   Order a copy of this article
    by Kaushik Sekaran, P. Venkata Krishna 
    Abstract: Cloud computing is the recent mantra for techies and internet users all around the world. The power of cloud computing is enormous, as it provides big services at an optimal cost and in a reliable manner. Load balancing of tasks on cloud servers is an important issue to be addressed. In this paper, we propose a task clustering algorithm to minimise the load across cloud servers through content based load balancing of tasks, together with a cost reduction method for optimal energy consumption at the cloud data centre heads. The results analysed in our paper are better when compared with existing content based load balancing models. Our approach clearly demonstrates optimal load balancing of tasks with respect to upload bandwidth utilisation, minimal latency and other QoS (quality of service) metrics.
    Keywords: Cloud computing; load balancing; tasks clustering; cost reduction; energy consumption; QoS (Quality of service) metrics.

  • Marker and Modified Graph Cut Algorithm for Augmented Reality Gaming.   Order a copy of this article
    by Shriram K. Vasudevan, R.M.D. Sundaram 
    Abstract: Augmented reality aims at superimposing a computer generated image on a user's view of the real world, thereby creating a composite view. Virtual reality, on the other hand, keeps the user isolated from the real world and immersed in a world that is completely fabricated. The main objective of this research is to capture a real life image and augment it as a component of a gaming environment using the principles of augmented reality. For this implementation, we have chosen car racing as our gaming environment. The core elements are image segmentation using a CIELAB color space based graph cut algorithm, 2D to 3D modelling, and game development with augmented reality. The tools utilised are MATLAB, insight3d and Unity3D. The proposed idea will enable someone to view a virtual environment with real components that are integrated dynamically.
    Keywords: Augmented Reality; Gaming; Image extraction; Modelling; Image segmentation; Racing.

  • Cross-corpus Classification of Affective Speech   Order a copy of this article
    by Imen Trabelsi, Mohammed Salim Bouhlel 
    Abstract: Automatic speech emotion recognition still has to overcome several obstacles before it can be employed in realistic situations. One of these barriers is the lack of suitable training data, both in quantity and quality. The aim of this study is to investigate the effect of cross-corpus data on the automatic classification of emotional speech. In this work, feature vectors constituted by the Mel Frequency Cepstral Coefficients (MFCC) extracted from the speech signal are used to train Support Vector Machines (SVM) and Gaussian mixture models (GMM). We evaluate on three different emotional databases from three different languages (English, Polish, and German) following three cross-corpus strategies. In the intra-corpus scenario, the accuracies were found to vary widely between 70% and 87%. In the inter-corpus scenario, the obtained average recall is 70.87%. The accuracies of the cross-corpus scenario were found to be below 50%.
    Keywords: Cross corpus strategies; Speech emotion recognition; GMM; SVM; MFCC.

  • GA based efficient Resource allocation and task scheduling in multi-cloud environment   Order a copy of this article
    by Tamanna Jena, Jnyana Ranjan Mohanty 
    Abstract: Efficient resource allocation to balance load evenly in a heterogeneous multi-cloud computing environment is challenging. Resource allocation followed by competent scheduling of tasks is of crucial concern in cloud computing. Load balancing assigns incoming job requests to resources evenly so that all involved resources are efficiently utilised. The number of cloud users is immense, the volume of incoming job requests is arbitrary, and the data involved in cloud applications is enormous. Since resources in cloud computing are limited, it is challenging to deploy various applications with irregular capacities and functionalities in a heterogeneous multi-cloud environment. In this paper, Genetic Algorithm based task mapping, followed by priority scheduling in a multi-cloud environment, is proposed. The proposed algorithm has two important phases, namely mapping and scheduling. Rigorous simulations were performed on synthetic data for a heterogeneous multi-cloud environment, and the experimental results are compared with existing First In First Out (FIFO) mapping and scheduling. The results clearly show better performance of the entire system in terms of makespan time and throughput.
    Keywords: Load Balancing; Task Scheduling; Cloud Computing; multi-cloud environment; Genetic Algorithm.
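A genetic-algorithm task-to-cloud mapping of the kind this abstract outlines can be sketched as below, with makespan as the fitness. The operator choices (elitist truncation selection, one-point crossover, single-gene mutation) and all parameter values are assumptions for illustration; the paper does not specify them here:

```python
import random

def ga_map_tasks(task_costs, n_clouds, pop=30, gens=60, seed=1):
    """GA sketch: gene i is the cloud assigned to task i; fitness = makespan."""
    rng = random.Random(seed)

    def makespan(chrom):
        loads = [0] * n_clouds
        for cost, cloud in zip(task_costs, chrom):
            loads[cloud] += cost
        return max(loads)                       # lower is better

    population = [[rng.randrange(n_clouds) for _ in task_costs]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=makespan)
        survivors = population[:pop // 2]       # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(task_costs))
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.2:              # mutation: reassign one task
                child[rng.randrange(len(child))] = rng.randrange(n_clouds)
            children.append(child)
        population = survivors + children
    best = min(population, key=makespan)
    return best, makespan(best)

# Six tasks with given costs mapped onto two clouds.
best, span = ga_map_tasks([4, 3, 2, 2, 1, 1], 2)
```

For this toy instance the total work is 13, so any two-cloud makespan is bounded below by 7; the GA searches toward that bound.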

  • Efficient and Secure Approaches for Routing in VANETs   Order a copy of this article
    by Marjan Kuchaki Rafsanjani, Hamideh Fatemidokht 
    Abstract: Vehicular ad hoc networks (VANETs) are a particular type of mobile ad hoc network (MANET). These networks provide communication services between nearby vehicles and between vehicles and roadside infrastructure, which improve road safety and provide travelers' comfort. Due to the characteristics of VANETs, such as self-organisation, low bandwidth, variable network density, rapid changes in network topology, support for safe driving and enhanced traffic efficiency, and due to their applications, problems related to these networks, such as routing and security, are popular research topics. A lot of research has been performed on providing efficient and secure routing protocols. In this paper, we investigate and compare various routing protocols based on swarm intelligence and key distribution in VANETs.
    Keywords: Vehicular ad hoc networks (VANETs); Swarm intelligence; Routing protocols; Cryptography.

  • Analysis of Energy Efficiency Based on Shortest Route Discovery in Wireless Sensor Network   Order a copy of this article
    by Mohit Mittal 
    Abstract: Today's scenario is driven by the advancement of existing technologies towards more reliable wireless communication. Wireless sensor networks are one of the popular emerging technologies and are commonly deployed in harsh environments. These networks depend mainly on battery power, so our mission is to reduce energy consumption as much as possible. Routing protocols for sensor networks are designed around minimum energy consumption. In this paper, the LEACH protocol is modified with various shortest path algorithms to find the best-performing configuration for the sensor network. Simulation results show that the Dijkstra algorithm performs better than the other algorithms.
    Keywords: LEACH; Energy efficiency; Bellman-ford algorithm; Dijkstra algorithm; BFS algorithm.
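The shortest-route computation underlying the comparison can be sketched with a standard heap-based Dijkstra; the toy topology and per-hop energy costs below are invented for illustration:

```python
import heapq

def dijkstra(graph, source):
    """Least-cost routes from `source`; edge weights model per-hop energy cost.

    graph: {node: [(neighbour, cost), ...]}
    Returns {node: minimum total cost from source}.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float('inf')):
            continue                      # stale heap entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy sensor field: sink 'S' reaches node 'C' via two relay paths.
net = {'S': [('A', 2), ('B', 5)],
       'A': [('C', 4)],
       'B': [('C', 1)],
       'C': []}
costs = dijkstra(net, 'S')
```

In a LEACH-style setting the same routine would run over cluster heads, with edge weights derived from transmission energy rather than hop distance.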

  • Optimum Generation and VAr Scheduling on a Multi-Objective Framework using Exchange Market Algorithm   Order a copy of this article
    by Abhishek Rajan, T. Malakar 
    Abstract: This paper presents an application of the Exchange Market Algorithm (EMA) to solving multi-objective optimization problems in power systems. This optimization algorithm is based on the activities of shareholders seeking to maximize their profit in the exchange market. The uniqueness of this algorithm lies in the fact that it enjoys double exploitation and exploration properties, unlike several other algorithms. In order to investigate its search capability, the EMA is used to solve the power system's active and reactive related objectives simultaneously in the presence of several non-linear constraints. Both the optimum generation and VAr planning problems are formulated as conventional Optimal Power Flow (OPF) problems. Fuel cost (an active-power related objective), transmission line loss and total voltage deviation (reactive-power related objectives) are taken as the objective functions. The multi-objective optimization is performed through a weighted sum approach, and both fuzzy and equal weight approaches are utilized to declare the compromised solution. Programs are developed in MATLAB and simulations are performed on the standard IEEE-30 and IEEE-57 bus systems. The search capability of EMA in solving the multi-objective power system problems is compared with PSO based solutions.
    Keywords: Optimal Power Flow; Exchange Market algorithm; multi-objective optimization; Pareto front; fuzzy decision making.

  • A Novel Three-Tier Model with Group Based CAC for Effective Load Balancing in Heterogeneous Wireless Networks   Order a copy of this article
    by Kalpana S, Chandramathi S, Shriram KV 
    Abstract: Seamless and ubiquitous connections are the ultimate objectives of 4G technologies. However, due to randomised mobility and the different service classes of applications, the connection failure rate increases, which can be overcome through handover (HO). With the increased demand for handovers, the number of networks scanned for decision making and the number of negotiations for connectivity become too large. To improve efficiency, a three tier model is proposed, where requests of a similar type are grouped and a common negotiation is made to reduce the number of communication messages. Only qualified networks among all the reachable access points are chosen for the decision. Handover need estimation is performed to reduce unwanted handovers. Finally, adaptive resource management is made possible through a group based call admission control (GB-CAC) algorithm that harmonises up to 50 percent of the resource utilisation, ensuring higher numbers of connections with negligible call blocking and dropping percentages.
    Keywords: Point of Attachment; handover; candidate networks; elimination factor; queues; Quality of Service; Smart Terminal.

  • Intricacies in Image steganography and Innovative Directions   Order a copy of this article
    by Krishna Veni, Sudhakar P 
    Abstract: With the advancement of digital communication and data sets growing huge due to the worldwide computerisation of data gathering, the need for data security in transmission also increases. Cryptography and steganography are well known methods for providing security: the former transforms information so as to cipher it, while the latter concentrates on concealing its very presence. Steganography is the practice of masking data, especially multimedia data, within other data. Visual content receives more attention from people than audio content, and a visual content file is huge compared with an audio file, which helps increase the robustness of hiding algorithms. In this paper, we consider three domains in which image steganography algorithms are proposed, along with experimental results on the USC-SIPI image database which demonstrate the improvement of the algorithms over the traditional ones. We propose a rule based LSB substitution method in the spatial domain, XOR based hiding in the frequency domain and data encryption standard based embedding in the wavelet domain. We find that the proposed algorithms have a better PSNR value, averaging close to 53 after embedding the secret data, while the existing algorithms have values of around 50.
    Keywords: Peak Signal to Noise Ratio; Quantization; Discrete Cosine Transformation; Wavelet; Steganalysis; cipher text.
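The textbook baseline the paper builds on, plain LSB substitution scored with PSNR, can be sketched as follows. The paper's rule-based, XOR-based and DES-based variants are not reproduced here; the cover image and payload are toy values:

```python
import numpy as np

def embed_lsb(cover, bits):
    """Write one payload bit into the LSB of each of the first len(bits) pixels."""
    stego = cover.copy()
    flat = stego.ravel()
    flat[:len(bits)] = (flat[:len(bits)] & ~np.uint8(1)) | np.asarray(bits, np.uint8)
    return stego

def extract_lsb(stego, n):
    """Read back the first n embedded bits."""
    return (stego.ravel()[:n] & 1).tolist()

def psnr(cover, stego):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((cover.astype(np.float64) - stego.astype(np.float64)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

cover = np.arange(64, dtype=np.uint8).reshape(8, 8)   # toy 8x8 cover image
payload = [1, 0, 1, 1, 0, 1, 0, 0]
stego = embed_lsb(cover, payload)
```

Each embedded bit changes a pixel by at most 1, which is why LSB methods keep PSNR high; the paper's reported values (around 50-53 dB) are in the range such small perturbations produce.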

  • Using Lego EV3 to Explore Robotic Concepts in a Laboratory   Order a copy of this article
    by Jeffrey W. Tweedale 
    Abstract: During a recent Massive Open On-line Course (MOOC) at the Queensland University of Technology (QUT) titled An Introduction to Robotics, a young student used the forum to ask about the skills required to gain employment. The resounding response was the need for multiple disciplines, typically including mechatronics, software, mechanical and electrical/electronics engineering. Similarly, the curriculum focused on professional systems and the scientific rigour involved in their evolution. This limits the growing community of enthusiasts and keen observers seeking greater involvement, as they are often constrained by the lack of Science, Technology, Engineering and Maths (STEM) skill sets. For these reasons, a means of accelerating the learning of key concepts is required, as well as a mechanism for providing cheap and reliable access to the tools and techniques required to participate. Although LEGO Mindstorms is considered a toy that has traditionally been targeted toward children aged 8-14, it does cater for enthusiasts and is increasingly being used to support STEM initiatives. Because of its low cost and availability, Mindstorms was recently used as the focal solution in the MOOC to enable every student to demonstrate robotic concepts independent of the pre-requisite skills. This raises a new question about how LEGO can be used to explore robotic concepts in a laboratory. The course shows it can be used for sensor development, and it was successfully used to enhance conceptual learning for the uninitiated (enthusiast, interested observer, undergraduate, post-graduate and even those being integrated within the domain).
    Keywords: Cartesian Coordinates; Forward Kinematics; Inverse Kinematics; Lego; Mindstorms; Robotics.

  • Applying Genetic Algorithm to Optimize the Software Testing Efficiency with Euclidean Distance   Order a copy of this article
    by Rijwan Khan 
    Abstract: Software testing ensures that developed software is error free and reliable for customer use. For verification and validation of software products, testing is applied to these products across many different software industries, so all types of testing are applied before the software is delivered to the customer. In this paper, automatic test cases are generated with the help of a genetic algorithm for data flow testing, and these tests are divided into groups using Euclidean distance. Elements of each group are applied to the data flow diagram of the program/software and all the du-paths covered by the given test suites are found. New test suites are then generated with the help of the genetic algorithm to cover all du-paths.
    Keywords: Software Testing; Automatic test cases; Data flow testing; Genetic Algorithm.
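The Euclidean-distance grouping step can be sketched as follows. The greedy radius-based grouping rule and the numeric test-case vectors are assumptions for illustration; the abstract does not specify the exact grouping criterion:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length numeric test-case vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def group_tests(test_cases, radius):
    """Greedy grouping: a test joins the first group whose representative
    (its first member) lies within `radius`; otherwise it starts a new group."""
    groups = []
    for case in test_cases:
        for group in groups:
            if euclidean(case, group[0]) <= radius:
                group.append(case)
                break
        else:
            groups.append([case])
    return groups

# Toy test cases as 2-D input vectors; two natural clusters emerge.
cases = [(0, 0), (1, 1), (10, 10), (11, 10), (0, 2)]
groups = group_tests(cases, radius=3.0)
```

Grouping similar test cases this way lets each group be exercised against the data flow graph together, which is the role the abstract assigns to the Euclidean-distance step.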

  • Dominion Algorithm- A novel metaheuristic optimization method   Order a copy of this article
    by Bushra Alhijawi 
    Abstract: In this paper, a novel bio-inspired and nature-inspired algorithm, namely the Dominion Algorithm, is proposed for solving optimization tasks. The fundamental concepts and ideas which underlie the proposed algorithm are inspired by nature and based on observation of the social structure and collective behavior of wolf packs in the real world. Several experiments were performed to evaluate the proposed algorithm and examine the correlation between its main parameters.
    Keywords: Dominion Algorithm; Metaheuristic methods; Biologically-inspired algorithm; Artificial intelligence.

  • Fitness Inheritance in Multi-objective Genetic Algorithms: A Case Study on Fuzzy Classification Rule Mining.   Order a copy of this article
    by Harihar Kalia, Satchidananda Dehuri, Ashish Ghosh 
    Abstract: In this paper, the trade-off between accuracy and interpretability in fuzzy rule-based classifiers is examined through the incorporation of fitness inheritance in multi-objective genetic algorithms. The aim of this mechanism is to reduce the number of fitness evaluations by estimating the fitness value of offspring individuals from the fitness values of their parents. The multi-objective genetic algorithm with this efficiency enhancement technique is a hybrid of the Michigan and Pittsburgh approaches. Each fuzzy rule is represented by its antecedent fuzzy sets as an integer string of fixed length, and each fuzzy rule-based classifier, which is a set of fuzzy rules, is represented as a concatenated integer string of variable length. Our algorithm simultaneously maximizes the accuracy of rule sets and minimizes their complexity (i.e., maximizes interpretability), and by adopting fitness inheritance it minimizes the total fitness computation time (i.e., the overall time to generate the rule set). Accuracy is measured by the number of correctly classified training samples, while rule complexity is measured by the number of fuzzy rules and/or the total number of antecedent conditions of the fuzzy rules. We examine our method through computational experiments on some benchmark datasets. The experimental outcomes confirm that the proposed method reduces the computational cost without decreasing the quality of the results in a significant way.
    Keywords: Classification; fuzzy classification; multi-objective genetic algorithm; fitness inheritance; accuracy; interpretability.
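The fitness-inheritance mechanism, estimating an offspring's fitness from its parents rather than running the expensive evaluation, can be sketched as below. The inheritance probability and the parental-mean estimator are common choices assumed for illustration, not the paper's exact scheme:

```python
import random

def inherited_fitness(parent_f1, parent_f2, child, evaluate,
                      p_inherit=0.7, rng=random):
    """With probability p_inherit, estimate the child's fitness as the mean of
    its parents' fitnesses instead of calling the costly true evaluation.

    Returns (fitness, evaluated_flag)."""
    if rng.random() < p_inherit:
        return (parent_f1 + parent_f2) / 2.0, False   # estimated, not evaluated
    return evaluate(child), True

# Toy check: an RNG that always triggers inheritance never calls evaluate.
class _AlwaysInherit:
    def random(self):
        return 0.0    # always < p_inherit

calls = []
def costly(individual):
    calls.append(individual)     # records each true evaluation
    return sum(individual)

fit, evaluated = inherited_fitness(10.0, 20.0, [1, 2], costly,
                                   rng=_AlwaysInherit())
```

Skipping a fraction `p_inherit` of evaluations is exactly where the reported reduction in total fitness computation time comes from, at the cost of some estimation noise.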

  • Geometric Based Histograms for Shape Representation and Retrieval   Order a copy of this article
    by Nacera Laiche, Slimane Larabi 
    Abstract: In this paper, we present a new approach for shape representation and retrieval based on histograms. In the proposed histogram descriptor, we consider the concept of curve points; this makes the histogram-based approach quite different, since the geometric description of these points is stored in the histograms. The proposed description is not only effective and invariant to geometric transformations and deformations, but is also insensitive to articulations and occluded shapes, as it has the advantage of exploring the geometric information of points. The generated histograms are then used to establish matching between shapes by comparing their histograms using dynamic programming. Experimental results of shape retrieval on different kinds of shape databases show the efficiency of the proposed approach when compared with existing shape matching algorithms in the literature.
    Keywords: Log-polar histogram; Least squares curve; High curvature points; Shape description; Shortest augmenting path algorithm; Shape retrieval.

  • Improved biogeography-based optimization   Order a copy of this article
    by Raju Pal, Mukesh Saraswat 
    Abstract: Biogeography-based optimization (BBO) is one of the popular evolutionary algorithms, inspired by the theory of island biogeography. It has been successfully applied to various real-world optimization problems such as image segmentation, data clustering, combinatorial problems, and many more. BBO finds the optimal solution using its two main operators, namely migration and mutation. However, it sometimes gets trapped in a local optimum and converges slowly due to the poor population diversity generated by the mutation operator. Moreover, the single-feature migration property of BBO gives poor performance on non-separable functions. Therefore, this paper introduces a new variant of BBO, known as improved BBO (IBBO), which enhances the migration and mutation operators. The proposed variant successfully improves the population diversity and convergence behavior of BBO and finds better solutions for non-separable functions. Its performance has also been compared and analyzed with other existing algorithms over 20 benchmark functions.
    Keywords: Evolutionary algorithm; Biogeography-based optimization; Migration operator; Mutation operator.
    DOI: 10.1504/IJAIP.2018.10022960
  • Sequential Pattern based Activity Recognition model for Ambient Computing   Order a copy of this article
    by J. Gitanjali, Muhammad Rukunuddin Ghalib 
    Abstract: In recent years, human activity recognition has gained popularity in ambient computing. Human activity recognition consists of identifying the daily activities of users by observing their actions. Action identification is a complex task given the data generated by each sensor. In this paper, sequential pattern based activity recognition is proposed for identifying sequential patterns among actions in a given dataset. A support value is used as a parameter to validate each sequence. The experimental evaluation is performed on a real-time dataset, and it is observed that the sequential pattern approach is very beneficial in reducing execution time and increasing the classification accuracy of the classifiers.
    Keywords: Action; Activity; sensor based data; sequence patterns; classifiers.
    DOI: 10.1504/IJAIP.2018.10033336
  • Evaluation of Large Shareholders Monitoring or Tunneling Behavior in Companies Accepted in Tehran Stock Exchange   Order a copy of this article
    by Sahar Mojaver 
    Abstract: Shareholders' wealth is very important in the real world of finance and has received increasing attention in recent years. Although the main purpose of each investment, and consequently of each company, is maximising shareholder wealth, over the past decades most companies have not paid enough attention to it. Ownership composition, particularly the ownership concentration of majority shareholders, is one of the most important factors influencing the control and management of companies. When large shareholders or internal shareholders such as managers have the capacity to control the company, they may have incentives to extract private benefits. Given the importance of the monitoring and behaviour of controlling shareholders, this study investigates the monitoring or tunneling behaviour of large shareholders in companies accepted in the Tehran Stock Exchange. To do so, 125 companies over the period 2010 to 2011 (a total of 750 company-years) are analysed using a systematic elimination sampling method. Results show that there is a significant relationship between large shareholders' tunneling behaviour and financial performance (return on equity and Tobin's Q indexes) in companies accepted in the Tehran Stock Exchange, and this relationship is U-shaped.
    Keywords: Tunneling Behavior; Large Shareholders; Companies Accepted in Tehran Stock Exchange.

  • Resource discovery in inter-cloud environment: A Review   Order a copy of this article
    by Mekhla Sharma, Ankur Gupta, Jaiteg Singh 
    Abstract: The inter-cloud has emerged as a logical evolution of cloud computing, extending computational scale and geographic boundaries through collaboration across individual Cloud Service Providers (CSPs). Resource discovery in this large-scale, distributed and highly heterogeneous environment remains a fundamental challenge to enabling effective cross-utilization of resources and services. This review paper examines various resource discovery approaches in the inter-cloud, outlining key challenges. Finally, the paper presents some ideas for building effective and efficient resource discovery strategies for the inter-cloud.
    Keywords: inter-cloud resource discovery; inter-cloud challenges; resource discovery challenges; resource discovery approaches.
    DOI: 10.1504/IJAIP.2018.10023054
  • Building a Simulated Educational Environment for the Diagnosis of Lumbar Disk Herniation Using Axial View MRI Scans   Order a copy of this article
    by Mohammad Alsmirat, Khaled Alawneh, Mahmoud Al-Ayyoub, Mays Al-dwiekat 
    Abstract: Computer-aided diagnosis systems have been the focus of many research endeavors. They are based on the idea of processing and analyzing various types of inputs (such as patients' medical history, physical examination results, images of different parts of the human body, etc.) to help physicians reach a quick and accurate diagnosis. In addition to being a great asset for any hospital (especially less fortunate ones with few or no radiologists), such systems represent invaluable platforms for educational and research purposes. In this work, we propose a system for the diagnosis, and training on the diagnosis, of lumbar disk herniation from Magnetic Resonance Imaging (MRI) scans. The proposed system has three main novel contributions. First, it utilizes the axial MRI spine view of the suspected region instead of the sagittal spine view; the axial view is usually more accurate and provides more information about lumbar disk herniation. Second, instead of simply classifying cases as normal or abnormal, the proposed system is capable of determining the type of lumbar disk herniation and pinpointing its location. To the best of our knowledge, this is the first work to address the problem of determining the type and location of lumbar disk herniation based on the axial MRI spine view. The final contribution of this work is the simulated training environment, which can be used to train novice radiologists on the diagnosis of lumbar disk herniation. The experiments conducted to evaluate the system show that it is quick and accurate, besides being very useful for training purposes.
    Keywords: Axial MRI Spine View; Classification; Computer-aided Diagnosis; Feature Extraction; Lumbar Disk Herniation; ROI Enhancement; ROI Extraction.

  • Types of fuzzy graph coloring and polynomial ideal theory   Order a copy of this article
    by Arindam Dey, Anita Pal 
    Abstract: The graph coloring problem (GCP) is one of the most important optimization problems in graph theory. In real-life scenarios, many applications of graph coloring are fuzzy in nature. Fuzzy sets and fuzzy graphs can manage the uncertainty associated with the information of a problem, where conventional mathematical models/graphs may fail to yield a satisfactory result. To include those fuzzy properties in solving such problems, we have extended various types of classical graph coloring methods to fuzzy graph coloring methods. In this study, we describe three basic types of fuzzy graph coloring methods, namely fuzzy vertex coloring, fuzzy edge coloring and fuzzy total coloring. We introduce a method to color the vertices of a fuzzy graph using polynomial ideal theory and find the fuzzy vertex chromatic number of the fuzzy graph. A practical example of scheduling committee meetings is given to demonstrate our proposed algorithm.
    Keywords: Fuzzy graph; Fuzzy coloring; Chromatic number; Polynomial ideal; Groebner basis.
    DOI: 10.1504/IJAIP.2018.10009343
  • Design and analysis of SRRC filter in wavelet based multiuser environment of mobile WiMax   Order a copy of this article
    by Harpreet Kaur, Manoj Kumar, Ajay K. Sharma, Harjit Pal Singh 
    Abstract: Wavelets, with their capability to provide simultaneous information in both the time and frequency domains along with minimized interference and improved bandwidth efficiency, are considered an efficient replacement for the fast Fourier transform (FFT) in conventional orthogonal frequency division multiplexing (OFDM) systems. To improve the quality of service (QoS) in such systems, spectrally efficient filter pulses are employed to mitigate the effect of inter-symbol interference (ISI) while satisfying the bandwidth limitations imposed by multipath fading channels. Moreover, allowing multiple users to utilize the transmission channel at the same time aims towards optimal resource allocation with acceptable error rates, considering the undesirable effects of correlated fading in the channel. In this paper, a multi-user environment is simulated in wavelet-based OFDM for a WiMax system, with SRRC pulses employed as transmit and receive filters to perform matched filtering. The performance, in terms of bit error rate (BER) as a function of signal-to-noise ratio (SNR), is investigated by varying the number of users, comparing the relative performance of various modulation schemes under an AWGN channel. The simulation outcome substantiates that implementing the multiuser environment, while overcoming co-channel interference, elevates channel capacity and meets higher data rate demands along with effective utilization of spectral resources. The simulation model is developed in MATLAB.
    Keywords: DWT; OFDM; Square Root Raised Cosine; Pulse shaping filter; multiuser; mobile WiMax.
    DOI: 10.1504/IJAIP.2018.10023307
  • A hybrid approach for improving data classification based on PCA and enhanced ELM   Order a copy of this article
    by Doaa L. El-Bably, Khaled M. Fouad 
    Abstract: The efficient and effective extraction of useful information from high-dimensional data is a problem worth studying. High-dimensional data is so big and complex that it becomes difficult to process and classify. Dimensionality reduction (DR) is an important and key method to address these problems. This paper presents a hybrid approach for data classification constituted from the combination of principal component analysis (PCA) and an enhanced extreme learning machine (EELM). The proposed approach has two basic components. First, PCA, as a linear data reduction, is implemented to reduce the number of dimensions by removing irrelevant attributes, to speed up the classification method and to minimize the complexity of computation. Second, EELM is performed by modifying the activation function of a single hidden layer feed-forward neural network (SLFN) to obtain a better distribution of categories. The proposed approach depends on a static determination of the reduced number of principal components. The proposed approach is applied on several datasets, and its effectiveness is assessed by performing different experiments. For more reliability, the proposed approach is compared with two previous works that used PCA and ELM in data analysis.
    Keywords: Data mining; Data classification; Principal component analysis (PCA); Neural Network; Extreme Learning Machine (ELM).
    DOI: 10.1504/IJAIP.2018.10013881
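The PCA-then-ELM pipeline the abstract describes can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' EELM: the paper's modified activation function and its static choice of component count are not reproduced, and `tanh` plus a fixed `k=2` are assumptions.

```python
import numpy as np

def pca_reduce(X, k):
    """Project X onto its top-k principal components (the linear DR step)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # rows of Vt = PCs
    return Xc @ Vt[:k].T

def elm_train(X, y_onehot, n_hidden, rng):
    """Single-hidden-layer ELM: random input weights, analytic output weights."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)               # hidden-layer activations
    beta = np.linalg.pinv(H) @ y_onehot  # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return (np.tanh(X @ W + b) @ beta).argmax(axis=1)

rng = np.random.default_rng(0)
# Toy data: a 2-D latent signal embedded in 10 dimensions plus small noise,
# so PCA with k=2 recovers the informative subspace.
Z0 = rng.standard_normal((200, 2))
X = Z0 @ rng.standard_normal((2, 10)) + 0.05 * rng.standard_normal((200, 10))
y = (Z0[:, 0] > 0).astype(int)
Y = np.eye(2)[y]

Z = pca_reduce(X, 2)                     # step 1: PCA
W, b, beta = elm_train(Z, Y, 40, rng)    # step 2: ELM on reduced data
acc = (elm_predict(Z, W, b, beta) == y).mean()
```

The appeal of this combination is that both steps are closed-form (an SVD and a pseudoinverse), so there is no iterative training at all.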
  • A Comprehensive Review on Time Series Motif Discovery using Evolutionary Techniques   Order a copy of this article
    by RAMANUJAM ELANGOVAN, Padmavathi S 
    Abstract: Time series data are produced daily in large quantities in virtually every field, and most of these data are stored in time series databases. A time series motif is a frequent, recurrent or previously unknown pattern occurring in a time series database that is used to aid the decision-making process. Time series motif mining is a useful summarization technique that supports additional techniques such as classification or clustering. Recently, diverse techniques have been proposed for time series motif discovery. This paper explores time series motif discovery using evolutionary techniques on various real-time data, together with their characteristics. The primary aim of this research is to provide a glossary for researchers interested in time series motif discovery and to aid in identifying potential research directions using evolutionary techniques.
    Keywords: Time Series; Motif; Data Mining; Evolutionary techniques; Genetic Algorithm.

  • Performance index assessment of intelligent computing methods in e-learning systems   Order a copy of this article
    by Aditya Khamparia, Babita Pandey 
    Abstract: With current advancing and smart growing technology, e-learning systems occupy the most dominant position in learning styles. Many research studies have evaluated e-learning systems using various criteria such as prediction accuracy, satisfaction degree and pre-post analysis, but none of them has explored a common methodology for appraising such systems. The proposed research work focuses on resolving the drawbacks of common benchmarks for evaluating the performance of e-learning systems by including importance (I) and complexity (CC), and also determines the measurements of different learning problems and learning techniques. Finally, the performance index (PI) is computed on the basis of I and CC, and is represented on a graph with a comparative view of importance (I), complexity (CC) and performance index (PI) for all the models.
    Keywords: LPI; LPM; LTCC; LPCC; PI.
    DOI: 10.1504/IJAIP.2018.10016273
  • Heterogeneous Mixing of Dynamic Differential Evolution Variants in Distributed Framework for Global Optimization Problems   Order a copy of this article
    by G. Jeyakumar, C. Shunmuga Velaytham 
    Abstract: Differential evolution (DE) is a real-parameter optimization algorithm in the pool of algorithms under the evolutionary computing field, well known for its simplicity and robustness. Dynamic differential evolution (DDE) was proposed in the literature as an extension of DE, to alleviate the static population update mechanism of DE. Since island-based distributed models are the natural extension of DE to parallelize it with a structured population, they can also be extended to DDE. This paper initially implements distributed versions of 14 variants of DDE, and also proposes an algorithm, hmDDEv (heterogeneous mixing of dynamic differential evolution variants), to mix different DDE variants in an island-based distributed model. The proposed hmDDEv algorithm is implemented and validated against a well-defined benchmarking suite of 14 benchmark functions, by comparing it with its constituent DDE variants. The efficacy of hmDDEv is also validated against two state-of-the-art distributed DE algorithms.
    Keywords: Dynamic Differential Evolution; Island Models; Distributed Algorithm; Mixed Variants.
    DOI: 10.1504/IJAIP.2018.10012580
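The dynamic population update that distinguishes DDE from classical DE (an improved trial replaces its target immediately, rather than at the end of the generation) can be sketched as below. This is a generic DE/rand/1 loop on the sphere benchmark, not the paper's hmDDEv island-mixing scheme; the F, CR and population-size values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def sphere(x):
    return float(np.sum(x ** 2))  # classic benchmark objective to minimize

def dde_sweep(pop, fit, F=0.5, CR=0.9):
    """One dynamic-DE sweep: each improving trial replaces its target vector
    immediately, so later mutations in the same sweep already see the update."""
    n, d = pop.shape
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        v = pop[r1] + F * (pop[r2] - pop[r3])      # DE/rand/1 mutation
        mask = rng.random(d) < CR
        mask[rng.integers(d)] = True               # binomial crossover
        trial = np.where(mask, v, pop[i])
        f = sphere(trial)
        if f <= fit[i]:                            # greedy selection
            pop[i], fit[i] = trial, f              # dynamic (in-place) update
    return pop, fit

pop = rng.uniform(-5, 5, size=(20, 5))
fit = np.array([sphere(x) for x in pop])
best0 = fit.min()
for _ in range(30):
    pop, fit = dde_sweep(pop, fit)
best = fit.min()
```

Because selection is greedy, the best fitness is monotonically non-increasing, which is the property island models exploit when migrating individuals between variant populations.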
  • A new Approach For Automatic Arabic-Text Detection and Localization in video frames   Order a copy of this article
    by Sadek Mansouri, Mbarak Charhad, Mounir Zrigui 
    Abstract: Text embedded in video frames provides useful information for semantic indexing and browsing systems. In this paper, we propose an efficient approach for automatic Arabic-text detection that combines edge information and the maximally stable extremal region (MSER) method in order to extract text region candidates. These regions are then grouped and filtered on the basis of geometric properties such as area and orientation. Besides, we introduce a new geometric descriptor of Arabic text, called the baseline, to improve the filtering process. Our proposed approach was tested on a large collection of Arabic TV news, and the experimental results have been satisfying.
    Keywords: Arabic text detection; Arabic news; baseline estimation; MSER.

  • Proposed Enhancement for Vehicle Tracking in Traffic Videos Based Computer Vision Techniques   Order a copy of this article
    by Mohamed Maher Ata, Mohamed El Darieby, Mustafa Abdelnabi, Sameh A. Napoleon 
    Abstract: In this paper, traffic video enhancement is approached by means of computer vision algorithms. We measured the average number of tracks assigned correctly in the whole video. These tracks express the correct prediction of vehicles, which guarantees that each vehicle is kept track of from the first frame until the last. In addition, some video degradations (i.e., salt & pepper, speckle and Gaussian noise) were applied in order to measure the effect of these degradations on tracking efficacy. Several filters were applied to the degraded traffic video in order to determine the best filter mask, i.e., the one yielding the least deviation in the number of assigned tracks. Experimental results show that the Wiener and disk filters are the best masks for salt-and-pepper video degradation, whereas the median filter mask is the best choice for both speckle and Gaussian video degradations.
    Keywords: Video disturbance; Prediction; Assigned track; GMM; Spatial filtering.

  • Speckle Noise Reduction in SAR Images using TypeII NeuroFuzzy Approach   Order a copy of this article
    by S. Vijayakumar, V. Santhi 
    Abstract: Synthetic aperture radar (SAR) images play a vital role in remote sensing applications and thus require quality enhancement, as they are affected by speckle noise. Speckle is a kind of noise that multiplies pixel intensities due to interference of the backscattered signal. In this paper, a computational-intelligence-based approach is proposed to remove speckle noise while preserving edge and texture information. In particular, the proposed system uses a type-II neuro-fuzzy approach with pixel neighbourhood topologies. The performance efficiency of the proposed system is demonstrated by comparing its results with existing methods.
    Keywords: SAR Image; Speckle Noise; Fuzzy Logic System; Artificial Neural Network Approach; Noise Reduction; Gaussian Model.
    DOI: 10.1504/IJAIP.2021.10036168
    by R. Aswini, Praveen Kumar Rajendran, A. Piosajin 
    Abstract: Data mining is the methodology that discovers useful and hidden information from large databases. Many researchers have proposed innumerable algorithms in the field of data mining. In this system, improvised UP-Growth is considered for mining high utility itemsets from potential high utility itemsets, and is improvised under different constraints. The node utility and the reorganized transaction utility (RTU) are considered the key terms in the proposed system, and are manipulated using the same technique as in UP-Growth. However, mining potential high utility itemsets from the RTU using UP-Growth needs a number of tree traversals. This is reduced in the proposed system by introducing a bottom-up approach and merging certain manipulations. Since running the system as a sequential process would be time-consuming, a distributed environment is considered in the proposed system to overcome this problem in the existing methodology.
    Keywords: Node utility; Transaction Utility; Transaction Weight Utility; Reorganized Transaction Utility; Potential High Utility Itemset.
    DOI: 10.1504/IJAIP.2018.10015437
  • An Enhanced Secure Data Aggregation Routing Protocol for Sensor Networks   Order a copy of this article
    by A.L. SREENIVASULU, Chenna Reddy P 
    Abstract: Over the past decade, the use of sensor devices in real-world applications has increased rapidly. To meet the demands of applications, sensor nodes are deployed in remote areas where operation is very complex, and the security of the sensor nodes can be compromised at any time. Therefore, a secure data aggregation mechanism is needed to overcome these limitations. In this paper, a secure data aggregation mechanism is proposed for securing the data from unauthorized access. The proposed method concentrates on three modules: data encryption, data aggregation and data decryption. Additionally, the data aggregation module concentrates on removing redundant data to minimize the energy consumption of the sensor nodes. The proposed method is evaluated under different conditions, and shows superior performance in terms of reduced communication overhead, minimized differences in energy consumption and increased data aggregation accuracy.
    Keywords: Data Communication; Aggregation; Encryption; Security; Sensor nodes.

  • An Efficient Approach towards Building CBIR Based Search Engine for Embedded Computing Board   Order a copy of this article
    by Shriram K Vasudevan, P.L.K. Priyadarsini, Sundaram RMD 
    Abstract: Investigating a picture gives us more information than can be expressed through words. Image processing is a field that is ever booming and handles vast numbers of images. Thanks to technology, we are able to store and retrieve massive image-based data sets from anywhere. Search engines provide a way to link images and queries: images are searched using various factors such as keywords, image dimensions and texture, which is called content-based image retrieval (CBIR). In this search methodology, the input query image is analysed and its properties, or features, are saved. Using the recorded features, other images matching the input image are retrieved. However, searching by just name, colour or texture is not very efficient, and so we have proposed a novel algorithm for the purpose. The proposed algorithm takes features such as colour, texture, SURF and entropy, and finds out how differently they work and what distinct results they produce when combined. Implementation of CBIR on a Beagle board led us to some satisfactory results, which encouraged us to do further research.
    Keywords: Retrieval; Wavelet; Histogram; Texture; OpenCV; MATLAB; Region of interest.

    by Ali Asghar Talebi, Samaneh Omidbakhsh 
    Abstract: In this paper, we introduce the concept of Cayley bipolar fuzzy graphs on bipolar fuzzy groups. Some properties of Cayley bipolar fuzzy graphs, such as connectivity and transitivity, are also provided.
    Keywords: Bipolar fuzzy groups; Cayley fuzzy graphs; isomorphism.

  • Impact of multimedia in learning profiles   Order a copy of this article
    by Ariel Zambrano, Daniela Lopez De Luise 
    Abstract: The original contribution of the present paper is the definition of an automated model of the behavior of a user towards certain types of images in a context of playful learning. Entropy is used to classify profiles, starting from temporal information mixed with certain characteristics previously extracted from the images. The aim is to determine to what extent visual images trigger functions of comprehension and abstraction on topics of a high degree of complexity. As part of the obtained model, learning profiles are generated; these will be enriched in the future with other non-invasive devices for observing the behavior of the user, for example cameras and keyboard and mouse monitoring, among others. The profiles are discovered and described with the minimum information needed. The collected information is processed with bio-inspired techniques, which are essentially based on deep learning concepts.
    Keywords: Audiovisual Techniques; Engineering Teaching; Video Games; Learning Model; Deep Learning; Multimedia; Data Mining.

  • Domination number of complete restrained fuzzy graphs   Order a copy of this article
    by R. Jahir Hussain, S. Satham Hussain, Sankar Sahoo, Madhumangal Pal 
    Abstract: This work is concerned with the complete restrained domination number and the triple connected domination number of fuzzy graphs. Some basic definitions and needed results are given with an example. The necessary and sufficient conditions for a fuzzy graph to have a complete restrained domination set are formulated and proved. The relation between the complete restrained domination set and the n-dominated set is also illustrated. Finally, the triple connected domination number of a complete restrained fuzzy graph is provided.
    Keywords: Fuzzy graphs; Complete restrained domination set; Complete restrained domination number; Triple connected domination number.

  • Rock Hyrax Intelligent Optimization Algorithm: An Exploration for Web 3.0 Domain Selection   Order a copy of this article
    by B. Suresh Kumar, Deepshikha Bharghava, Arpan Kumar Kar, Chinwe Peace Igiri 
    Abstract: Currently, the immense growth of internet usage has become a bottleneck for web developers trying to meet customer requirements. To analyze this changing scenario, developers need to meet these requirements through the introduction of various optimization techniques; numerous such techniques are available to explore the Web 3.0 domain. In this research, the authors propose a new metaheuristic approach aimed at providing an appropriate solution to these analysis and optimization issues. The main motivation for designing this algorithm, despite the existing ones, is a wider search space and less optimization time, based on the foraging time of rock hyraxes. Here, a swarm intelligence metaheuristic is proposed based on the biological behavior of the rock hyrax found in East Africa. This novel rock hyrax intelligent optimization algorithm (RHIO) is used to optimize results in the Web 3.0 domain.
    Keywords: Metaheuristics; Web 3.0; Optimization; Swarm Intelligence; Rock Hyrax Intelligent Optimization (RHIO).
    DOI: 10.1504/IJAIP.2021.10032230
  • A Novel Statistical Approach to Event Management: a study and analysis of a Techfest with suggestions for improvements   Order a copy of this article
    by Narassima Seshadri, Shriram KV 
    Abstract: Events play a vital role in day-to-day life, whether casual or professional. Formal events that occur routinely over a period of time need to be successful to become sustainable. Event management strategies vary constantly, as the choices of different people, and even of the same people, change as time progresses. Educational institutions showcase their talents by organizing annual fests, gathering like-minded people from various institutions to exhibit their talents and gain knowledge. These events need to be successful in order to attract an audience and to be sustained over a long time. This study examines various aspects of Anokha 2016, the sixth annual Techfest of Amrita School of Engineering, so as to improve Anokha 2017. The paper investigates the aspects that remained favorites as well as those that did not meet the participants' expectations, and discusses suggestions to improve these aspects.
    Keywords: Event management; Techfest; Educational institution; Reliability analysis; Construct validity; Hypothesis testing.

  • A hybrid approach of Missing Data Imputation for Upper Gastrointestinal Diagnosis   Order a copy of this article
    by Khaled Fouad 
    Abstract: Gastrointestinal and liver diseases (GILDs) are major causes of death and disability in the Middle East and North Africa. The investigation of upper gastrointestinal (GI) symptoms in a medically resource-limited area is a challenge. Real-world clinical data analysis using data mining techniques often faces observations that contain missing values for a number of attributes; the main challenge in mining a real clinical dataset of the upper GI to diagnose diseases is the existence of these missing values. The missing values should be tackled first to achieve highly accurate and effective results from a data mining approach for diagnosing and predicting upper GI diseases. In this paper, the proposed approach to missing data imputation pre-processes the real clinical dataset of the upper GI so that feature selection and classification algorithms can be applied with accurate and effective results, providing accurate diagnosis and prediction of upper GI diseases. The proposed approach aims at tackling the missing data in the upper GI categorical dataset and enhancing the accuracy of the classifiers by exploiting a feature selection method before the imputation process. The approach is evaluated by implementing an experimental framework with five phases: partitioning the dataset into eight different datasets with various ratios of missing data, performing the feature selection, imputing the missing data, classifying the imputed data, and finally evaluating the outcome using k-fold cross-validation on nine evaluation measures.
    Keywords: Data mining; Data classification; Feature selection; Missing data imputation; Categorical Data mining; Diagnosis of upper GI diseases.

  • Evaluation Method based on a Tracing Mechanism for Adaptive User Interfaces: Application in Intelligent Transport Systems   Order a copy of this article
    by Makram Soui, Soumaya Moussa, Christophe Kolski, Mourad Abed 
    Abstract: Nowadays, adaptive user interfaces (AUI) are more and more present in our daily life activities (at home, at work, in public places, etc.). They can have different adaptation capabilities, can be disseminated in the environment of the users, and take into account different user profiles. Many academic and industrial studies have been conducted on user modelling and on design methods and tools for user interface (UI) generation. However, the evaluation of such user interfaces is difficult; in fact, there exist relatively few works in the literature on AUI evaluation. To fill this gap, it is necessary to envisage new evaluation methods focused on the adaptation quality of the UI. In this research work, we propose an evaluation method called MetTra (METhod based on a TRAcing system). This method has been validated by evaluating AUIs in the transportation field.
    Keywords: Adaptive User Interface (AUI); Evaluation; MetTra; Intelligent Transport Systems (ITS).
    DOI: 10.1504/IJAIP.2018.10019161
  • Types of uncertain nodes in a fuzzy graph   Order a copy of this article
    by Arindam Dey, Anita Pal 
    Abstract: Graph theory has numerous applications in problems of operations research, economics, systems analysis and transportation systems. However, real applications of graph theory are full of linguistic vagueness, i.e., uncertainty. For example, the vehicle travel time or the number of vehicles on a road network may not be known precisely. In those types of problems, a fuzzy graph model can be used to deal with the uncertainties. In a fuzzy graph, it is very important to identify the nature (strength) of nodes, and no such analysis of nodes is available in the literature. In this paper, we introduce a method to find the strength of a node in a fuzzy graph. The degree of the node and the maximum membership value of the adjacent edges of that node are used to compute the strength of the node. The strength of a fuzzy node is itself a fuzzy set. Depending upon their strength, we classify the nodes of a fuzzy graph into six types, namely α-strong fuzzy node, -strong fuzzy node, regular fuzzy node, α-weak fuzzy node, -weak fuzzy node and balance fuzzy node.
    Keywords: Fuzzy graph; fuzzy node; Strength of node; vagueness of object.
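The abstract says a node's strength is computed from its degree and the maximum membership value of its adjacent edges, but does not give the formula. The sketch below is therefore a hypothetical combination (fuzzy degree normalized by the best case where every incident edge carried the maximum membership); the graph, the `strength` formula and all names are illustrative assumptions, not the paper's definition.

```python
# A fuzzy graph as a map from undirected edges to membership values in (0, 1].
edges = {("a", "b"): 0.7, ("a", "c"): 0.4, ("b", "c"): 0.9, ("c", "d"): 0.2}

def incident(node):
    """Membership values of the edges adjacent to `node`."""
    return [m for (u, v), m in edges.items() if node in (u, v)]

def fuzzy_degree(node):
    """Degree of a fuzzy node: sum of incident edge memberships."""
    return sum(incident(node))

def strength(node):
    """Hypothetical strength in (0, 1]: fuzzy degree relative to the best
    case where every incident edge had the maximum observed membership."""
    ms = incident(node)
    return fuzzy_degree(node) / (len(ms) * max(ms))
```

Under this illustrative definition, node "d" (one incident edge) has strength 1.0, while node "a" has strength 1.1/1.4, so nodes whose incident edges are uniformly strong score higher than nodes with one dominant edge.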

  • A kernel based SVM for Semantic Relations Extraction from Biomedical Literature   Order a copy of this article
    by Kanimozhi Uma 
    Abstract: Relation extraction, which identifies and extracts semantic relationships among named entities, is a significant approach in knowledge representation. In order to capture the semantic as well as syntactic structures in text and to enable deep understanding of biomedical literature, relation extraction becomes essential. The automatic extraction of disease-gene relations is presented by utilizing shallow linguistic features of global and local word-sequence context with a string-kernel-based support vector machine (SVM). The performance of the proposed work shows that bag-of-features kernel-based SVM classification is a promising solution for specific disease-gene association mining.
    Keywords: Biomedical Relation Extraction; Natural Language Processing; Machine Learning; Biomedical Literature.

  • Implementing RSA algorithm for network security using Dual Prime Secure Protocol (DPSP) in crypt analysis   Order a copy of this article
    by R. Durga, Periyasamy Sudhakar 
    Abstract: Cryptography is the most important approach for secure communication in network security, and the RSA algorithm is commonly used in efficient cryptographic mechanisms. Here, the RSA algorithm is used to monitor scenarios involving hackers and to change the way of transpositions. The original RSA crypto mechanism is needed to support the behavioural characteristics of a multi-privacy system and to explore the specified strengthening technique. In this methodology, the user uses the RSA (DPSP) algorithm and generates dual prime pairs for the encrypted messages, which are sorted priority-wise, translating and rotating the intractable algorithm to obtain the essential security enhancement. This methodology reduces the danger of man-in-the-middle attacks and timing attacks, because the encrypted and decrypted messages are additionally reordered according to their priority. The RSA (DPSP) algorithm is mainly applied to distributing data across different environments, and a variety of approaches is available for implementing the computation of the designed algorithm. To perform the process with real-time data, a cryptographic encryption algorithm along with the RSA crypto algorithm is used. We introduce secure RSA (DPSP) for secure file transmission, since there are several cases where secure file transmission is required to avoid attacks from intruders. In the RSA (DPSP) algorithm, the most important key representation is the symmetrical random key of the crypto mechanism. We apply the mechanism to enhance confidential data transfer and to manage different sizes of messages using time-complexity methodologies, which depend on the size of the messages and the size of the key derived from the prime numbers.
    Keywords: Cryptography; RSA algorithm secured protocol; file transmission; nodes; ns2 tool; priority programming.
    DOI: 10.1504/IJAIP.2018.10020057
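For reference, the baseline RSA mathematics that DPSP builds on can be shown with the classic textbook prime pair. This is only standard RSA on toy numbers; the paper's DPSP-specific dual prime pairs and priority sorting are not specified in the abstract and are not reproduced here.

```python
from math import gcd

# Textbook RSA key generation with small (insecure, illustrative) primes.
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, must be coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)        # private exponent: modular inverse of e mod phi

m = 65                     # message encoded as an integer < n
c = pow(m, e, n)           # encryption: c = m^e mod n
recovered = pow(c, d, n)   # decryption: m = c^d mod n
```

Real deployments use primes of hundreds of digits and padding schemes; the point here is only that encryption and decryption are the same modular-exponentiation operation with swapped exponents.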
    by Pijush Samui, Aditi Palsapure, Sanjiban Roy 
    Abstract: Foundation settlement is an important design criterion, as it affects the durability of a structure. Conventional methodologies calculate only a global factor of safety to determine the safety of the structure; however, this does not account for the uncertainties due to soil variability and measurement errors. Therefore, reliability-based design principles must be incorporated to determine the performance and reliability of a structure. The first order second moment method (FOSM) is generally used for this analysis, but it is time consuming. On the other hand, the relevance vector machine (RVM) achieves very good generalization performance. Thus, in our study, we have used RVM-based FOSM and ELM and compared the results obtained from both. For this, a dataset of 480 readings was developed for cohesive frictional soil, taking the Poisson's ratio and elastic modulus parameters as random variables. 70% of the readings were used for training and 30% for testing, and normalised data was used. Additionally, several error and correlation functions were also calculated to assess the performance of the models.
    Keywords: settlement; Reliability analysis; FOSM; RVM; ELM.

  • Dynamic Service Oriented Resource Allocation system for Interworking Broadband Networks   Order a copy of this article
    by Kokila Subramanian, Sivaradje Gopalakrishnan 
    Abstract: Optimizing available radio resources efficiently for the diverse traffic categories in a heterogeneous interworking network is the key issue of radio resource management (RRM). In this paper, an advanced RRM method, the dynamic application-centric resource provisioning algorithm (DAC-RP), is proposed to provide users with a dedicated set of suitable channels for real-time (RT) and non-real-time (NRT) services based on bandwidth conditions, maximizing capacity while satisfying QoS constraints. DAC-RP is realized over an Ultra Mobile Broadband (UMB) - Worldwide Interoperability for Microwave Access (WiMAX) - Wireless Local Area Network (WLAN) hybrid interworking network, linked over a novel Intelligent Internet Protocol (IIP) architecture. IIP is a unified architecture obtained by merging IMS call session control functions (CSCFs), application services, enhanced IMS and centralized services under a single layer with a common set of control and routing functions, to converge heterogeneous protocols, functional entities and applications. The competency of IIP and DAC-RP is validated by comparing the performance metrics of RT and NRT applications simulated for the IIP-based UMB-WiMAX-WLAN network, developed using OPNET, against the scenario using the existing IMS and against a UMTS-WiMAX-WLAN network.
    Keywords: Radio Resource Provisioning; Quality of Service; Broadband wireless network; absolute partition; Heterogeneous network; Call Control layer; Real- Time; Non-Real-Time Application; IP Multimedia Subsystem.

  • Image Denoising using Fast Non Local Means Filter and Multi-Thresholding with Harmony Search Algorithm for WSN   Order a copy of this article
    by Rekha Haridoss, Samundiswary Punniakodi 
    Abstract: Image denoising is one of the challenging tasks in a wireless sensor network (WSN). Several image denoising algorithms have been developed so far to obtain better denoised sensor images, but they fail to preserve the edges of the images because of spatial averaging. In order to overcome the loss of image edges, an attempt has been made in this paper by incorporating filters such as the fast non-local means filter (FNLMF) and the high boost filter into the existing wavelet-thresholding-based denoising method. However, the denoised output and the computation time are significantly affected by the wavelet properties. Hence, instead of wavelet thresholding, this paper concentrates on histogram-based multi-thresholding (HMT) as the major part of denoising the image. Here, the corrupted image is first denoised by applying the FNLMF filtering section. Then the edges and fine details of the denoised output are enhanced by utilizing HMT with a harmony search algorithm (HSA) based optimization technique. Further, various images with different noise deviations are considered in order to evaluate the performance of the proposed method using MATLAB simulation. The simulation results indicate that the proposed method gives better results in terms of peak signal-to-noise ratio (PSNR), image quality index (IQI) and computation time than the existing method.
    Keywords: Denoising; Multi-Thresholding; Bilateral Filtering; Non Local Means Filter; Harmony Search Algorithm.
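The harmony search step used for threshold selection improvises new candidate threshold vectors from a memory of good ones. A minimal, generic sketch of that loop follows; it is not the authors' implementation, and the toy quadratic objective merely stands in for a histogram-based criterion such as between-class variance:

```python
import random

def harmony_search(objective, bounds, hms=10, hmcr=0.9, par=0.3, iters=300, seed=0):
    """Minimal harmony search: minimise `objective` over the box `bounds`."""
    rng = random.Random(seed)
    # Initialise harmony memory with random candidate threshold vectors.
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                     # draw from memory
                x = memory[rng.randrange(hms)][d]
                if rng.random() < par:                  # pitch adjustment
                    x += rng.uniform(-1, 1) * 0.05 * (hi - lo)
            else:                                       # random improvisation
                x = rng.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        s = objective(new)
        worst = max(range(hms), key=lambda i: scores[i])
        if s < scores[worst]:                           # replace the worst harmony
            memory[worst], scores[worst] = new, s
    best = min(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]

# Toy objective: two thresholds should sit near 85 and 170 on a 0-255 scale.
obj = lambda t: (t[0] - 85) ** 2 + (t[1] - 170) ** 2
best, score = harmony_search(obj, [(0, 255), (0, 255)])
```

In the actual method the objective would score a threshold vector against the image histogram rather than fixed target values.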

  • Reduction of jitter in 3D video by transmitting over multiple network Paths   Order a copy of this article
    by Vishwa Kiran, Raghuram Shivram, Thriveni J, Venugopal K R 
    Abstract: Stereoscopic video transmission in telemedicine applications requires data to be transferred with minimal jitter. It is not possible to send stereoscopic video at full HD rate over a single Internet Service Provider (ISP), as the bandwidth becomes a bottleneck and congestion can lead to packet drops, eventually causing jitter in the video. This can be circumvented by employing multiple ISPs to stream stereoscopic video over multiple Real-time Transport Protocol (RTP) sessions. Using multiple ISPs results in multiple network paths between the video streaming device and the video consumers. This effectively involves aggregating bandwidth, delay, jitter, packet loss and other qualitative network attributes across every ISP participating in the video transmission process. This article analyses, through simulation, the collective delay and jitter that affect the video reconstruction process, and concludes with an estimate of the minimum qualitative network parameters required.
    Keywords: 3D Video; Bandwidth; Cloud Aggregation Server; Discrete Event Simulator; ISP; Jitter; Multipath; Multiple ISPs; SimPy Simulator; Stereoscopic Video.

  • Some Applications Of Vague Sets   Order a copy of this article
    by Hossein Rashmanlou, Kishore Kumar Krishna, S. Firouzian, Mostafa Noori 
    Abstract: In this paper, we give a concise note on vague fuzzy sets and present two applications of vague sets. The first applies vague fuzzy sets to career determination using assumed data, conducted with the aid of a new distance measure for vague fuzzy sets. The second deals with research questionnaire construction, filling, analysis and interpretation: respondents' decisions are obtained by assuming the questionnaire is distributed among respondents, converted into a vague data set and analysed, and interpretations are drawn from the result.
    Keywords: Vague set; fuzzy set; distance measure; hesitancy.

  • Domination in Hesitancy Fuzzy Graphs   Order a copy of this article
    by R. Jahir Hussain, S. Satham Hussain, Sankar Sahoo, Madhumangal Pal 
    Abstract: Hesitant fuzzy sets (HFS), introduced by Torra, are a recent extension of fuzzy sets that models the uncertainty originating from hesitation in assigning membership degrees of elements to a fuzzy set. Hesitancy fuzzy graphs (HFG) were introduced to capture the common difficulty that arises when a membership degree must be selected from several possible values, causing one to hesitate. HFGs have been used to choose a Time Minimized Emergency Route (TiMER) for transporting accident victims. This paper addresses domination in hesitancy fuzzy graphs. Using the concepts of strength of a path, strength of connectedness and strong arcs, the domination set is established. A necessary and sufficient condition for the minimum domination set of an HFG is investigated. Further, some properties of the independent domination number of an HFG are obtained, and the proposed concepts are illustrated with suitable examples.
    Keywords: Domination number; Hesitancy fuzzy graphs; Independent domination set; Necessary and sufficient condition; Strong arc.

  • Trust-based-Tuning of Bayesian watchdog Intrusion Detection for Fast and improved Detection of Black Hole Attacks in Mobile Adhoc Networks   Order a copy of this article
    by Ruchi Makani, B.V.R. Reddy 
    Abstract: The watchdog is a well-known intrusion detection mechanism for Mobile Ad hoc Networks (MANETs): it not only monitors the traffic between peer nodes but also analyses the data to discern malicious activity, and it has been widely adopted for detecting black-hole attacks. However, the watchdog suffers from serious limitations, namely a high number of false positives/negatives. Integrating Bayesian filtering into the watchdog improves performance in terms of data throughput, speed of attack detection and accuracy in reporting malicious activity, and the Bayesian watchdog can be further enhanced by effective tuning. This paper presents trust-based tuning of the Bayesian watchdog, a novel approach to enhancing detection speed, eliminating false alarms and improving data throughput. The proposed trust-based tuning of the Bayesian watchdog has been evaluated through simulations, and encouraging results support the proposed approach.
    Keywords: Bayesian; Intrusion Detection; MANET; Trust; Watchdog.
    DOI: 10.1504/IJAIP.2021.10034313
  • Multi-objective Artificial Bee Colony Algorithm in Redundancy Allocation Problem   Order a copy of this article
    by Monalisa Panda, Satchidananda Dehuri, Alok Jagadev 
    Abstract: This paper presents an empirical study of uncovering Pareto fronts by multi-objective artificial bee colony for the redundancy allocation problem (RAP). Multi-objective artificial bee colony has been successfully applied to many optimization problems; however, very little effort has been directed towards solving RAP. In this work, we consider simultaneous optimization of the unavoidable objectives of maximizing reliability, minimizing cost and minimizing weight in a series-parallel system, which leads to a multiple-objective redundancy allocation problem (MORAP). The objective of this paper is to uncover true Pareto fronts populated with non-dominated solution sets as a solution to MORAP using a multi-objective artificial bee colony algorithm (MOABC). Two MOABC algorithms have been developed, inspired by the popular and established multi-objective genetic algorithms Vector Evaluated Genetic Algorithm (VEGA) and Non-dominated Sorting Genetic Algorithm II (NSGA-II); we name them MOABC-I and MOABC-II, respectively. The experimental results show that the approximation of the true Pareto front by MOABC-II is better than the front obtained through MOABC-I. Further, the resulting Pareto fronts are assessed by two multi-criterion decision making (MCDM) methods, the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) and the Analytic Hierarchy Process (AHP), to reach a definite goal.
    Keywords: Redundancy allocation problem; Genetic algorithms; Multi-objective optimization; Artificial bee colony; Multi-objective artificial bee colony; Multi-criteria decision making.
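The non-dominated sorting that both MOABC variants rely on reduces to a pairwise dominance test over objective vectors. A generic sketch follows; the sample vectors are invented for illustration, with reliability negated so that all three RAP objectives are minimised:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(front):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in front if not any(dominates(q, p) for q in front)]

# RAP-style objective vectors: (-reliability, cost, weight), all to be minimised.
solutions = [(-0.99, 120, 30), (-0.95, 80, 25), (-0.99, 150, 40), (-0.90, 90, 35)]
pareto = non_dominated(solutions)
```

Here the third solution is dominated by the first (same reliability, higher cost and weight) and the fourth by the second, leaving a two-point front.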

  • Cost Effective Hybrid Genetic Algorithm for Scheduling Scientific Workflows in Cloud under Deadline Constraint   Order a copy of this article
    by Gursleen Kaur, Mala Kalra 
    Abstract: The cloud has emerged as a convenient platform for executing complicated scientific applications from multiple disciplines by providing on-demand, scalable infrastructure on a rental basis. The research and scientific community often opt for workflows to model these scientific applications. Workflow scheduling has been extensively studied for decades with regard to grid and cluster computing, but few initiatives have been tailored for the cloud. What is more, previous work fails to incorporate the basic principles of IaaS clouds, such as the pay-as-you-go model, elasticity, heterogeneity and dynamic provisioning, along with VM performance variation and acquisition delay, besides other QoS requirements. This paper proposes a resource provisioning and scheduling strategy using a genetic algorithm that aims to optimize the overall execution cost while staying below a given deadline. Performance is further enhanced by using a high-quality seed generated by the Predict Earliest Finish Time (PEFT) algorithm, which acts as a catalyst and helps the algorithm converge faster. The proposed approach is simulated in WorkflowSim and evaluated using various well-known realistic scientific workflows of different sizes. The results validate the better performance of our approach over numerous state-of-the-art algorithms.
    Keywords: Cloud Computing; Workflow Scheduling; PEFT; Genetic Algorithm; Time-Cost trade off; Dynamic Resource Provisioning.
    DOI: 10.1504/IJAIP.2018.10023063
  • Multi-Key Searchable Encryption Technique For Index - Based Searching   Order a copy of this article
    by Putti Srivani, Sirandas Ramachandram, Rangu Sridevi 
    Abstract: A multi-key searchable encryption scheme enables keyword search over data encrypted with different keys. The scheme is practical for client-server applications: it achieves data confidentiality while allowing the server to perform search operations on the encrypted data. So far, the algorithm has only been implemented for sequential search. This paper presents an improved version of the multi-key searchable encryption algorithm implemented for index-based searching, and reports experimental results for the index-based scheme implemented in C with the PBC library. Elliptic Curve Cryptography (ECC) is used to improve key security: a sequence of steps generates a secure key for the user via a hash function, and the use of the hash function in ECC enhances performance for index-based searching. Experimental results show the execution time of the improved scheme for index-based search constructed over different elliptic curves. An application using the new scheme was also built to search one lakh encrypted documents, with Java as the front end and MongoDB as the back end.
    Keywords: encryption; search token; delta; token; client; server; confidentiality; searchable; multi-key; index; public key; elliptic curve cryptography.
    DOI: 10.1504/IJAIP.2018.10018843
  • Connectivity Analysis of Multihop Wireless Networks Using Route Distribution Model   Order a copy of this article
    by Abdullah Waqas 
    Abstract: Most management and routing protocols in multihop wireless networks rely on a strict connectivity condition among nodes. In this paper, we construct a mathematical framework to model the connectivity of wireless ad hoc and wireless sensor networks. We derive expressions for the distribution of the distance between nodes, which is used to calculate the transmission power required to establish connections between nodes inside each other's communication circle. We then present a Route Distribution Model (RDM) to establish routes between a source and a destination that are outside each other's communication range. The results show that the transmission power required to establish a connected network depends on the number of nodes in the network as well as on their distribution. The results are analyzed for low-, medium- and high-density networks with uniform and Poisson distributed nodes, and show that a connected network is achieved at relatively lower transmission power if nodes establish multihop routes to transmit their data towards the destination.
    Keywords: Ad hoc networks; connectivity; minimum transmission range; node degree model; sensor networks.
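The paper's central quantity, the minimum transmission range at which a randomly deployed network becomes connected, can also be estimated empirically. A small sketch under an assumed uniform deployment on the unit square (a Monte Carlo check, not the paper's analytical model):

```python
import math
import random

def is_connected(nodes, r):
    """BFS over the disk graph: nodes within range r of each other are neighbours."""
    n = len(nodes)
    seen, stack = {0}, [0]
    while stack:
        i = stack.pop()
        for j in range(n):
            if j not in seen and math.dist(nodes[i], nodes[j]) <= r:
                seen.add(j)
                stack.append(j)
    return len(seen) == n

def min_connect_range(nodes, step=0.01):
    """Smallest transmission range (to `step` resolution) giving one component."""
    r = step
    while not is_connected(nodes, r):
        r += step
    return r

rng = random.Random(1)
nodes = [(rng.random(), rng.random()) for _ in range(30)]  # uniform on unit square
r_min = min_connect_range(nodes)
```

Repeating this over many random deployments and node counts gives the empirical trend the abstract describes: denser networks connect at smaller ranges.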

  • Optimization of SPARQL queries over the RDF data in the Cloud Environment   Order a copy of this article
    by Ranichandra Dharmaraj, Tripathy B.K. 
    Abstract: The semantic web is built with the support of the Resource Description Framework (RDF). The changing face of the semantic web has created the need for new approaches to store and query RDF data. RDF datasets contain large volumes of data with many bindings, and processing SPARQL queries over RDF data in the cloud creates several challenges: network cost and query processing time majorly impact the performance of queries over the cloud. This paper proposes an optimization algorithm for query processing over large datasets. The proposed algorithm takes the parallel execution of queries as its major objective, reducing the network cost as well as minimizing the response time of the query. The experimental evaluation is carried out using the LUBM 400-university dataset on hardware rented from Amazon Web Services. The proposed algorithm proves its efficiency in reducing query response time and minimizing network traffic.
    Keywords: Query; SPARQL; RDF data; Response time; Distributed cloud.

  • Privacy Preserving Using Diffie-Hellman and An Envelope Protocol Through Key Handling Techniques In Cloud Storage   Order a copy of this article
    by K. Santhi Sri, N. Veeranjaneyulu 
    Abstract: Cloud computing is continuously advancing and demonstrating reliable growth in the field of computing. It is gaining popularity by providing distinct computing services such as cloud storage, cloud hosting and cloud servers for various types of enterprises as well as academia. On the other side, there are many issues related to cloud security and privacy; security remains a basic challenge in the cloud computing paradigm. These challenges include loss of users' secret data, data leakage and disclosure of personal data. Owing to these security and privacy concerns, users' sensitive data in cloud storage is exposed to various vulnerabilities. In this paper, considering the risk of storing data in the cloud with a third party and of stored data being accessed by cloud users, we propose a novel mechanism that gives cloud users confidence in their security against the third party, the cloud service provider, and provides privacy to cloud data users through an efficient group key management scheme.
    Keywords: cloud computing; Privacy Preserving; Data owner; Cloud User; key.
    DOI: 10.1504/IJAIP.2018.10014407
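The Diffie-Hellman step in a scheme like this lets two parties derive a shared group key over an insecure channel without transmitting it. A textbook sketch with a deliberately small toy prime (illustrative only; a real deployment needs a standardised group of at least 2048 bits, or an elliptic-curve group):

```python
import secrets

def dh_keypair(p, g):
    """Generate a Diffie-Hellman private/public pair modulo prime p."""
    priv = secrets.randbelow(p - 2) + 2        # private exponent in [2, p-1]
    return priv, pow(g, priv, p)

# Toy 61-bit Mersenne prime for illustration only.
p = 2**61 - 1
g = 3
a_priv, a_pub = dh_keypair(p, g)               # Alice
b_priv, b_pub = dh_keypair(p, g)               # Bob
shared_a = pow(b_pub, a_priv, p)               # Alice computes g^(ab) mod p
shared_b = pow(a_pub, b_priv, p)               # Bob computes the same value
```

Both sides arrive at the same secret because exponentiation commutes: (g^a)^b = (g^b)^a mod p.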
  • Certain graph parameters in bipolar fuzzy environment   Order a copy of this article
    by Ganesh Ghorai, Sankar Sahoo, Madhumangal Pal 
    Abstract: Yang et al. [16] introduced the concept of generalized bipolar fuzzy graphs in 2013. In this paper, we introduce certain concepts of covering, matching and paired domination using strong arcs in bipolar fuzzy graphs, with suitable examples, and investigate some of their properties. We also calculate the strong node covering number, strong independence number and other parameters of complete and complete bipartite bipolar fuzzy graphs.
    Keywords: Bipolar fuzzy graphs; strong arcs; covering; matching; paired domination.
    DOI: 10.1504/IJAIP.2018.10024070
  • Teaching Learning Based Optimization for Job Scheduling in Computational Grids   Order a copy of this article
    by Tarun Kumar Ghosh, Sanjoy Das 
    Abstract: Grid computing is a framework that enables the sharing, selection and aggregation of geographically distributed resources dynamically to meet current and growing computational demands. Job scheduling is the key issue in Grid computing, and the scheduling algorithm has a direct effect on the performance of the whole system. Because of the distributed, heterogeneous nature of the resources, job scheduling in a computational Grid is an NP-complete problem, so meta-heuristics are a more appropriate option for obtaining near-optimal results. In this paper, the recent Teaching Learning Based Optimization (TLBO) is proposed for the job scheduling problem in computational Grid systems, minimizing makespan, processing cost and job failure rate while maximizing resource utilization. To measure the efficacy of the proposed TLBO, Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) are considered for comparison. The comparative results show that the proposed TLBO technique outperforms the other two algorithms.
    Keywords: Computational Grid; Job Scheduling; Makespan; Processing Cost; Fault Rate; Resource Utilization; GA; PSO; TLBO.
    DOI: 10.1504/IJAIP.2018.10026530
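TLBO itself is simple to state: the teacher phase pulls each learner towards the best solution relative to the class mean, and the learner phase lets random pairs of learners teach each other. A generic sketch on a toy objective (the grid-scheduling objectives from the paper are replaced here by a sphere function; this is not the authors' code):

```python
import random

def tlbo(objective, bounds, pop=20, iters=100, seed=0):
    """Minimal TLBO (teacher + learner phases) minimising `objective`."""
    rng = random.Random(seed)
    dim = len(bounds)
    clip = lambda x: [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    F = [objective(x) for x in X]
    for _ in range(iters):
        teacher = X[min(range(pop), key=lambda i: F[i])]
        mean = [sum(x[d] for x in X) / pop for d in range(dim)]
        for i in range(pop):
            tf = rng.choice((1, 2))                   # teaching factor
            cand = clip([X[i][d] + rng.random() * (teacher[d] - tf * mean[d])
                         for d in range(dim)])
            fc = objective(cand)
            if fc < F[i]:                             # teacher phase: keep if better
                X[i], F[i] = cand, fc
            j = rng.randrange(pop)                    # learner phase: random partner
            if j != i:
                sign = 1 if F[i] < F[j] else -1
                cand = clip([X[i][d] + sign * rng.random() * (X[i][d] - X[j][d])
                             for d in range(dim)])
                fc = objective(cand)
                if fc < F[i]:
                    X[i], F[i] = cand, fc
    best = min(range(pop), key=lambda i: F[i])
    return X[best], F[best]

best, score = tlbo(lambda x: sum(v * v for v in x), [(-10, 10)] * 3)
```

A notable property, and part of TLBO's appeal for scheduling, is that it has no algorithm-specific tuning parameters beyond population size and iteration count.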
  • On the learning machine in quaternionic domain and its application   Order a copy of this article
    by Sushil Kumar, Bipin Kumar Tripathi 
    Abstract: There are various high-dimensional engineering and scientific applications in communication, control, robotics, computer vision, biometrics, etc., where researchers face the problem of designing an intelligent and robust neural system that can process high-dimensional information efficiently. In the literature, conventional real-valued neural networks have been applied to problems with high-dimensional parameters, but the required network structures are highly complex, very time consuming and weak to noise; such networks are also unable to learn magnitude and phase values simultaneously in space. A quaternion is a number that possesses magnitude in all four directions with phase information embedded within it. This paper presents a learning machine with a quaternionic domain neural network that can process the magnitude and phase information of high-dimensional data without any hassle. The learning and generalization capability of the proposed learning machine is demonstrated on 3D linear transformations, 3D face recognition and chaotic time series prediction (the Lorenz system and Chua's circuit) as benchmark problems, which demonstrate the significance of the work.
    Keywords: Quaternion; quaternionic domain neural network; 3D motion; 3D imaging; time series prediction.
    DOI: 10.1504/IJAIP.2018.10013702
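The appeal of quaternionic neurons is that a single quaternion carries magnitude and 3D orientation together: a 3D rotation, for instance, is one sandwich product rather than a 3x3 matrix multiply. A minimal sketch of the underlying algebra (the standard Hamilton product, not the paper's network):

```python
import math

def qmul(a, b):
    """Hamilton product of quaternions a = (w, x, y, z) and b = (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qconj(a):
    """Quaternion conjugate; the inverse of a unit quaternion."""
    w, x, y, z = a
    return (w, -x, -y, -z)

def rotate(v, q):
    """Rotate 3D point v by unit quaternion q via q * (0, v) * q^-1."""
    qv = (0.0,) + tuple(v)
    _, x, y, z = qmul(qmul(q, qv), qconj(q))
    return (x, y, z)

# 90-degree rotation about the z-axis encoded as a unit quaternion.
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
rotated = rotate((1.0, 0.0, 0.0), q)   # approximately (0, 1, 0)
```

A quaternionic network applies weights through this product, which is what lets it learn magnitude and phase jointly instead of treating the four components as unrelated reals.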
  • Categorization of Random Images into fog and blur based on the Statistical Analysis   Order a copy of this article
    by Monika Verma, Vandana Dixit Kaushik, Vinay Pathak 
    Abstract: Noisy images are a bottleneck in solving image processing problems. This paper aims to classify images into different types of foggy and blurry images. A feature based classifier, called the FB Classifier, is proposed. Given an image, the classifier can tell whether the image is clear or unclear, which type of distortion is present (foggy or blurry), and the category of fog or blur. The quality of images taken with any equipment depends on a few factors: (1) the medium in which the photograph is taken; (2) the movement of the camera, the object, or both; and (3) the quality of the capturing equipment. All algorithms for classification or removal of distortions are made to handle these three scenarios, which encompass all types of foggy or blurry images. The images are assigned different threshold values according to their properties, and the cumulative threshold value finally decides which type of image it is. The algorithm is simple to implement, yet comparable to state-of-the-art methods.
    Keywords: statistical analysis; classifier; categorization; point spread function; cumulative probability of blur detection; eccentricity; textured segments; deblurring.

  • Aspect based summarisation in the big data environment   Order a copy of this article
    by K. Krishnakumari, E. Sivasankar 
    Abstract: Due to the large amount of information available, it is difficult for customers to select a superior product: reviews on shopping sites may confuse a customer purchasing a product, and with such a volume of information it is difficult to assess all of the reviews. Sentiment analysis plays an active role in extracting and identifying the opinion of the customer who purchased the product, and sentiment summarization helps the customer buy the best product based on its features and values. Our technique involves aspect based sentiment analysis followed by summarization. The datasets analyzed are huge and cannot be handled by traditional single-machine systems, so we propose a parallel approach using a Hadoop cluster to extract features and opinions. By referring to an online sentiment dictionary and the Interaction Information (IIn) method, the sentiments are predicted and then summarized using clustering. After classifying each opinion word, our summarization system generates a short summary of the product based on several features. This makes the customer feel comfortable and improves competitive intelligence.
    Keywords: Sentiment summarization; Opinion; Aspects; Hadoop; MapReduce; big data.
    DOI: 10.1504/IJAIP.2018.10023308
  • Efficient Video Transmission Technique using Clustering and Optimization Algorithms in MANETs   Order a copy of this article
    by G.N. Vivekananda, P. Chenna Reddy 
    Abstract: Mobile Ad hoc Networks (MANETs) are infrastructure-less wireless networks that can configure themselves and operate above the link layer. Video transmission requires high bandwidth along with tight delay constraints, and for continuous media streaming packets must be delivered in a timely fashion. The network may frequently become congested due to external traffic, and many network designs cannot provide an optimized solution for the layers to adapt to particular application requirements and underlying channel conditions. Our proposed method overcomes these issues. Initially, at the sender side, the input video is partitioned into frames. The Discrete Wavelet Transform (DWT) decomposes the frames into sub-bands, and quantization then produces a bit stream from the subgroups. The bit stream is transmitted using Stream Control Transmission Protocol (SCTP) multi-streaming, with a cross-layer mechanism to promote rapid video transmission. The available nodes are clustered using the Enhanced Fuzzy C-Means algorithm (EFCM), and the path for the video stream is selected using the Enhanced Cuckoo Search (ECS) algorithm. At the receiver side, the frames are reconstructed by performing the inverse of the transmission process, so the video tolerates internal congestion and is viewed as the original video without any distortion. The proposed technique is assessed using delay, delivery ratio, overhead, throughput and energy consumption while varying the number of nodes and rates.
    Keywords: Clustering; MANETs; Optimization; SCTP; Video Streaming.
    DOI: 10.1504/IJAIP.2021.10022050
  • On Lifetime Enhancement of Wireless Sensor Network using Particle Swarm Optimization   Order a copy of this article
    by Ashish Pandey, Shashank Shekhar, Arnab Nandi, Banani Basu 
    Abstract: In this article, a multi-dimensional, multi-objective optimization method to manage the network as per specific requirements, while considering the constraints of Wireless Sensor Networks (WSNs), is studied. A particle swarm optimization (PSO) based technique is used to address energy management and lifetime issues. A predefined percentage of nodes are assumed to be supernodes with higher energy than ordinary nodes; supernodes serve as cluster heads to enhance the lifetime of the network. Free-space and fading channel models are considered to evaluate the performance of the PSO-based WSNs. The location of the sink node and the node density are varied to study their effect on network performance metrics. The population of supernodes is also varied to demonstrate its effect on the network while keeping all other parameters unchanged. The energy consumed by each cluster head (CH) is also studied to observe the load distribution among the CHs.
    Keywords: WSNs; Wireless sensor networks; cluster; supernodes; PSO.
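The canonical PSO update used in such cluster-head studies moves each particle towards its personal best and the swarm's global best. A generic sketch on a stand-in objective (the WSN energy model itself is not reproduced; the quadratic cost below is purely illustrative):

```python
import random

def pso(objective, bounds, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Canonical PSO minimising `objective` over the box `bounds`."""
    rng = random.Random(seed)
    dim = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                       # personal best positions
    Pf = [objective(x) for x in X]
    g = min(range(n), key=lambda i: Pf[i])      # global best index
    for _ in range(iters):
        for i in range(n):
            for d, (lo, hi) in enumerate(bounds):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (P[g][d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
            f = objective(X[i])
            if f < Pf[i]:
                P[i], Pf[i] = X[i][:], f
                if f < Pf[g]:
                    g = i
    return P[g], Pf[g]

# Stand-in for a cluster-head energy cost: distance to an assumed ideal layout.
best, cost = pso(lambda x: sum((v - 2.0) ** 2 for v in x), [(0, 10)] * 2)
```

In a WSN setting the particle would encode candidate cluster-head assignments and the objective would combine residual energy and intra-cluster distances.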

  • A Novel Map Matching Algorithm: for Real Time Location using Low Frequency Floating Trajectory Data   Order a copy of this article
    by Kanta Prasad Sharma, Ramesh C. Poonia, Surendra Sunda 
    Abstract: Continuous enhancement of technology and modern, well-equipped infrastructure are necessary for easy living. Road accidents and missing-vehicle rates are continually increasing due to traffic hazards, making their prevention very challenging. The single most effective way to protect human life under such conditions is more reliable navigation services, such as correct location tracking of vehicles on the road network. Real-time location tracking methods depend fully on map matching algorithms, which also compute a reliable path on the road network. Using the proposed map matching algorithm, a smart vehicle can provide more reliable tracking services during or before any mishap. This work contributes towards ensuring the correct location for necessary action during a mishap, alerting the accident zone and communicating messages without wasting valuable time. The proposed approach is validated on real tracking data and compared against poor GPS services.
    Keywords: Map Matching; Confidence level; GAGAN; GPS trajectory; Perpendicular Distance; Euclidean distance; Signal Frequency; Kalman Filter; Clusters.
    DOI: 10.1504/IJAIP.2018.10017590
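The perpendicular distance listed in the keywords is the geometric core of most map matching algorithms: each GPS fix is projected onto candidate road segments and snapped to the nearest one. A minimal sketch of that step (segment coordinates are invented; the paper's confidence-level and Kalman-filter stages are omitted):

```python
def project_to_segment(p, a, b):
    """Project GPS fix p onto road segment ab; return (distance, closest point)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg2 = dx * dx + dy * dy
    # Clamp the projection parameter t to [0, 1] so the point stays on the segment.
    t = 0.0 if seg2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg2))
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5, (cx, cy)

def match(p, segments):
    """Geometric map matching: snap fix p to the nearest candidate segment."""
    return min((project_to_segment(p, a, b) + (idx,)
                for idx, (a, b) in enumerate(segments)),
               key=lambda r: r[0])

segments = [((0, 0), (10, 0)), ((0, 5), (10, 5))]   # two parallel roads
dist, snapped, road = match((3, 1), segments)       # fix lies nearer road 0
```

Real map matchers add a confidence score and a filter (e.g. Kalman) on top of this snap so that a noisy fix near an intersection is not matched to the wrong road.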
  • A Hybrid Approach for Deep Belief Networks and Whale Optimization Algorithm to Perform Sentiment Analysis for MOOC Courses   Order a copy of this article
    by Jayakumar Sadhasivam, Ramesh Babu Kalivaradhan 
    Abstract: Sentiment classification has won significant attention recently, as it provides a way to automatically analyse people's reviews and extract user information regarding a product or service. One widely used technique is polarity classification, which determines the polarity of the texts in an opinion. Accordingly, this paper presents a technique for sentiment classification of online course reviews using a novel classifier, the Whale-based Deep Belief Network (WDBN). In the proposed technique, the input course review data is pre-processed, and important features are extracted using an Emotion-SentiWordNet based feature extraction process. For classifying the sentiments in the extracted features, WDBN is introduced by combining Deep Belief Networks (DBN) and the Whale Optimization Algorithm (WOA) so that the weights of the network layers are selected optimally. The proposed technique classifies the course reviews into two classes, positive and negative. The proposed WDBN classifier is experimented on a publicly available online course review dataset, and its performance is evaluated using three metrics: sensitivity, specificity and accuracy.
    Keywords: Deep Belief Networks (DBN); MOOC; Sentiment classification; SentiWordNet; Whale Optimization Algorithm (WOA); Neural Network.

    by N. Vijayaraj, T. Senthil Murugan 
    Abstract: The cloud provides different types of resources based on user requirements. The cloud user sends the requirements and needed resources to the cloud service provider (CSP), and the CSP allocates different levels of service and resources based on the user's requirements and investment. Thus users and providers enter into an auction, one of the interesting resource allocation scenarios that has become a major area of research in recent years. The key behind the auction is a demand-and-supply scheme between cloud user requirements and the cloud service provider. A major part of the double auction is the Winner Determination Problem (WDP), since determining the winners is a hard combinatorial problem. The proposed scenario also concentrates on Quality of Service (QoS) for cloud service providers and cloud users. As stated in the problem statement, an effective multi-attribute combinatorial double auction with an on-demand bidding strategy is proposed, a multi-round bidding strategy is imposed, and an Imposition of Penalty/Compensation Strategy (IP-CS) for failing to provide the promised QoS is proposed. IP-CS is simulated using CloudSim and a Java-based simulator for the cloud environment. The results satisfy the double auction with multi-round bidding and the Imposition of Penalty/Compensation Strategy (IP-CS).
    Keywords: CSP; WDP; Double Auctioneer.
    DOI: 10.1504/IJAIP.2021.10016178
  • New Concepts of Product Vague Graphs with Applications   Order a copy of this article
    by Hossein Rashmanlou, Kishore Kumar Krishna, S. Lavanya, Ali Asghar Talebi 
    Abstract: It is known that vague models give more precision, flexibility and compatibility to the system as compared with classic and fuzzy models. Vague graphs have an important role in neural networks, computer networks and clustering. In the design of a network, it is important to analyse connections by their levels, and the structural properties of vague graphs provide a tool that identifies solutions to operations research problems. In this paper, we define the ring sum of two product vague graphs and analyse some interesting properties of isomorphism on product vague graphs.
    Keywords: Ring sum; direct product; product vague graphs.

  • Swarm Intelligence for a Single Source Product Distribution   Order a copy of this article
    by Surafel Tilahun 
    Abstract: Distributing products under different constraint sets is one of the challenging tasks in industry. In this paper, distribution from a single source to different centers using a finite, small number of transport vehicles is discussed. For a production center, the source, there can be multiple centers to which the products are transported by trucks, and each truck should carry an equal load in terms of the centers to which it distributes products. The problem can be seen as a set of multiple travelling salesman problems. This problem is formulated, and a custom-made swarm intelligence algorithm based on the prey predator algorithm is used. Three data sets of different categories, including cases with restrictions on moving from one center to another, are generated and used to test the algorithm. The data sets are also attached in the appendix for future research and comparison.
    Keywords: Swarm intelligence; product distribution; travelling salesman problem; prey predator algorithm.

  • Prediction of Exchange Rate Using Improved Particle Swarm Optimized Radial Basis Function Networks   Order a copy of this article
    by Trilok Pandey, Satchidananda Dehuri, Alok Jagadev 
    Abstract: In this paper, a radial basis function neural network (RBFN) model has been trained by canonical particle swarm optimization (PSO) and improved particle swarm optimization (IMPSO) algorithms to efficiently predict the exchange rate of the Indian rupee against the G-7 countries for future days. We use the two PSO variants, canonical PSO and IMPSO, to optimize the parameters of the radial basis function neural network by learning from past exchange rate data. Forty-three countries' exchange rates have been collected, and based on their correlation analysis a dataset has been prepared to validate the proposed model. In addition, a fair comparison has been carried out between the IMPSO-tuned RBFN and the canonical PSO-tuned RBFN with respect to the results obtained by varying the number of iterations for future-day prediction. From the experimental results, it is observed that the predictive performance of the IMPSO-tuned RBFN model with a higher number of iterations is promising vis-à-vis its canonical PSO-tuned counterpart.
    Keywords: radial basis function network; neural network; radial basis function; canonical particle swarm optimization; improved particle swarm optimization model; exchange rate.
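An RBFN prediction is just a weighted sum of Gaussian basis activations; the role of PSO/IMPSO in the paper is to tune the centres, widths and output weights. A minimal forward-pass sketch with hand-set parameters (illustrative only, not the trained model):

```python
import math

def rbf_predict(x, centers, widths, weights, bias=0.0):
    """RBFN output: bias plus a weighted sum of Gaussian basis activations."""
    acts = [math.exp(-sum((xi - ci) ** 2 for xi, ci in zip(x, c)) / (2 * s ** 2))
            for c, s in zip(centers, widths)]
    return bias + sum(w * a for w, a in zip(weights, acts))

# Two hand-set hidden units; in the paper the centres, widths and weights
# would be tuned by canonical PSO or IMPSO rather than fixed by hand.
centers = [(0.0,), (1.0,)]
widths = [0.5, 0.5]
weights = [1.0, -1.0]
y0 = rbf_predict((0.0,), centers, widths, weights)
y1 = rbf_predict((1.0,), centers, widths, weights)
```

For exchange rate prediction the input vector would hold lagged rates of the correlated currencies, and the single output would be the next-day rate.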

  • Software Fault Prediction Using Hybrid Swarm Intelligent Cuckoo and Bat based k-means++ Clustering Technique   Order a copy of this article
    by Shruti Aggarwal, Paramvir Singh 
    Abstract: k-means and its various hybrids are popularly used for software fault prediction. k-means++ is a hybrid clustering algorithm which overcomes the major issue of getting stuck at local optima. In this paper, swarm intelligence based hybrid techniques, viz. the Cuckoo Algorithm, which improves the fitness function, and the Bat Algorithm, which swarms with varying speeds, are applied to the k-means++ algorithm to design a new hybrid clustering technique. The KBat++ algorithm is a hybrid clustering technique with an increased convergence rate, which is further improved by applying the robust Cuckoo swarm intelligence technique to generate the CKBat++ algorithm, expected to produce optimised, high-quality clusters. Experiments are performed using open-source UCI and Promise datasets to implement and compare the performance of the designed algorithms with the KBat and k-means++ algorithms. Accuracy, cluster quality checks, CPU time, etc. are used for performance comparisons. Results indicate that the designed technique, which predicts and categorises software faults into faulty and non-faulty clusters to avoid errors and increase software reliability, performs considerably better than its counterparts.
    Keywords: Fault prediction; clustering; swarm intelligence; Cuckoo Algorithm; Bat Algorithm; k-means; k-means++ algorithm.
    DOI: 10.1504/IJAIP.2021.10016288
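The k-means++ property the abstract relies on — avoiding poor local optima — comes from its seeding rule: later centres are drawn with probability proportional to the squared distance to the nearest centre already chosen. A small sketch of that seeding step (toy 2-D "software metric" data, invented for the example):

```python
import random

random.seed(1)

def kmeanspp_seeds(points, k):
    """k-means++ seeding: first centre uniform at random, each
    later centre picked with probability proportional to the
    squared distance to its nearest already-chosen centre."""
    centres = [random.choice(points)]
    while len(centres) < k:
        d2 = [min((px - cx) ** 2 + (py - cy) ** 2 for cx, cy in centres)
              for px, py in points]
        r = random.random() * sum(d2)
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                centres.append(p)
                break
    return centres

# two well-separated "faulty" / "non-faulty" metric clusters
pts = [(random.gauss(0, 0.1), random.gauss(0, 0.1)) for _ in range(50)] + \
      [(random.gauss(100, 0.1), random.gauss(100, 0.1)) for _ in range(50)]
seeds = kmeanspp_seeds(pts, 2)
print(seeds)
```

With D²-weighted seeding the two seeds land in different clusters with overwhelming probability, which is exactly what plain k-means' uniform seeding fails to guarantee.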
  • Multi-Resolution Image Fusion With Regularization Framework Using Enhanced Gabor Prior Method   Order a copy of this article
    by Ravikanth Garladinne, K.V.N. Sunitha, B. Eswara Ruddy 
    Abstract: Image fusion is defined as the process of combining relevant information from two or more images captured using different sensors, in order to produce a single output image with improved visual perception. It is considered one of the most powerful tools transforming the field of image processing in areas such as medicine, astronomy and defence. Satellite photography has made it possible to observe the earth's surface without being in contact with the area of interest, and over the last four decades advances in remote sensing technology have improved the methods for collecting, processing and analysing the data. Many researchers have used model-based approaches to fusion, with emphasis on improving the fused image quality and reducing colour distortion. In a model-based technique, the low-resolution multi-spectral image is modelled as a blurred and noisy version of its ideal high-resolution fused image. Since this problem is ill-posed, regularization is needed to obtain the final solution. In the proposed model-based approach, a learning-based method that uses panchromatic data obtains the required degradation matrix that accounts for aliasing. Then, using the proposed model, the final solution is obtained by solving the inverse problem, where a Markov random field smoothness prior regularizes the solution. In order to better preserve spatial detail and to improve the estimate of the fused image, we solve the multi-resolution fusion problem in a regularization framework by making use of a new prior called the enhanced Gabor prior. Use of the enhanced Gabor prior ensures that features at different spatial frequencies of the fused image match those of the available HR panchromatic image. Along with the enhanced Gabor prior, we also include an MRF prior which maintains spatial correlatedness among the HR pixels.
    Keywords: Resolution; fusion problem; image fusion; image enhancement; remote sensing.

    by Lakshman Narayana 
    Abstract: With the expansion of mobile devices, the number of nodes in mobile ad hoc networks (MANETs) has increased. MANETs are dynamic in nature, which creates problems in determining the best route for packets. Moreover, packets may face excess traffic and congestion in the network, which degrades overall network performance and, making matters worse, can even lead to packet loss. This paper proposes a routing protocol that combines the properties of both static and dynamic routing protocols and thereby tries to eliminate the problems inherent in the network through density-based routing. The paper concentrates on an efficient route discovery process for secure data transfer with a low packet loss ratio. The analysis mainly focuses on the normal traffic of the network; after analysing it, the packet is given a path from source to destination that is less congested. For minimising attacks and packet dropping, various authors have built methods such as node authentication, passive feedback schemes, ACK-based methods, status-based schemes and incentive-based schemes. ACK-based schemes suffer from massive overhead due to extra acknowledgement packets, and also from decision ambiguity if the requested node declines to send back an acknowledgement. In this paper we use a 2-ACK based scheme over a secure channel to overcome the problem of decision ambiguity for the requested node, improve node authentication and limit packet dropping in ad hoc networks.
    Keywords: Dynamic routing; MANETs; traffic analysis; packet loss reduction; 2-ack method.

  • An efficient Quantum Hash based CP-ABE framework on Cloud Storage data   Order a copy of this article
    by Kranthi Kumar Singamaneni, P. Sanyasi Naidu 
    Abstract: With the exponential growth of cloud data and storage space, cloud security has become one of the most interesting research areas in cloud computing. Attribute-based encryption (ABE) is a public key encryption scheme that allows cloud users to secure their sensitive information on public cloud servers. Quantum key distribution (QKD) is required to improve the security of communication systems. Quantum cryptographic schemes depend entirely on quantum mechanics; the major objective of quantum key distribution is to generate a key that takes part in encryption. Traditional attribute-based encryption models are insecure and susceptible to key distribution attacks such as man-in-the-middle attacks. Also, as the size of the input data increases, traditional ABE models fail to compute an efficient secret key due to computational time and network overhead. To overcome these issues, a novel chaotic-integrity and quantum key distribution (QKD) based ciphertext-policy ABE model is implemented in a cloud environment. Experimental results show that the proposed model offers higher computation speed, lower storage overhead and more secure key distribution compared to traditional CP-ABE, KP-ABE and QKD-ABE models.
    Keywords: ABE; CPABE; quantum distribution; data security; cloud computing.
    DOI: 10.1504/IJAIP.2021.10033337
  • A State-of-the-Art Neuro-Swarm Approach for Prediction of Software Reliability   Order a copy of this article
    by Ajit Behera, Ch. Sanjeev Dash, Mrutyunjaya Panda, Satchidananda Dehuri, Rajib Mall 
    Abstract: Software reliability is one of the foremost factors in assessing the quality of software. It is evident from past research that no single general model has emerged in the arena of software reliability research to predict the reliability of software. Therefore, many attempts continue to be made, from different corners, to build a generic and widely acceptable model. In this paper, we propose a neuro-swarm software reliability model by combining the best attributes of the functional link artificial neural network (FLANN) and particle swarm optimization (PSO). FLANNs have been successfully employed to solve non-linear regression and time series problems; however, their application in software reliability is rare. This intensive work elucidates the feasibility of using FLANNs to predict software reliability. PSO is used to tune the parameters of the FLANN during model development. An extensive experimental study on several benchmark software reliability datasets reveals that the FLANN model with particle swarm optimization (PSO-FLANN) yields better predictions than methods such as the Back-propagation Neural Network (BPNN), the Dynamic Evolving Neuro-Fuzzy Inference System (DENFIS), the Non-linear Ensemble Back Propagation Neural Network (NEBPNN), and the canonical FLANN. Hence, the proposed model may be a suitable and promising alternative for predicting software reliability.
    Keywords: Software reliability; Functional link artificial neural network; Particle swarm optimization; Normalized Root Mean Square Error.

  • Genetic Algorithm based Rule Generation for Approximate Keyword Search   Order a copy of this article
    by Priya Mani, Kalpana R 
    Abstract: Many problems in natural language processing, data mining, information retrieval, and bioinformatics can be formalised as string transformation: given an input string, the system generates the k most likely output strings corresponding to it. The existing rule-based method for approximate keyword search uses two processes, learning and generation, which improve both accuracy and efficiency, but not to the expected level. A new genetic algorithm based approach to generating the rules is introduced to support pattern matching, and the generated rules are learned by applying a maximum likelihood function in order to build the rule dictionary. The given query keyword is searched in the database by constructing a tree-based index called the Aho-Corasick tree and performing pattern matching against the rule dictionary, retrieving the document even if it contains a misspelled string. Experimental results show improved accuracy and efficiency compared to existing methods.
    Keywords: Bigram Dice Coefficient; Rule dictionary; Divide and Conquer; Error Correction; Maximum a likelihood.
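The bigram Dice coefficient named in the keywords is the similarity measure typically used to score approximate matches between a misspelled query and dictionary keywords. A small self-contained sketch (vocabulary and query invented for illustration; the paper's rule dictionary and Aho-Corasick index are not reproduced here):

```python
def bigrams(s):
    """Set of adjacent character pairs in s."""
    return {s[i:i + 2] for i in range(len(s) - 1)}

def dice(a, b):
    """Bigram Dice coefficient: 2|A ∩ B| / (|A| + |B|)."""
    ba, bb = bigrams(a.lower()), bigrams(b.lower())
    if not ba and not bb:
        return 1.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))

def best_match(query, vocabulary):
    """Return the vocabulary word most similar to the query."""
    return max(vocabulary, key=lambda w: dice(query, w))

vocab = ["transformation", "information", "retrieval", "bioinformatics"]
print(best_match("informtion", vocab))   # misspelled query
```

Here the misspelled query "informtion" still shares eight of its nine bigrams with "information", so the correct keyword wins despite the typo.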

  • Fractional order Control of Switched Reluctance Motor   Order a copy of this article
    by Sihem Ghoudelbourk, Ahmad Taher Azar, Djalel Dib, Abelkrim Rechach 
    Abstract: In recent years, the switched reluctance machine (SRM) has become a competitive technology in the automotive and aeronautics fields, because of their growing need for electric drives with frequent, large high-speed variations. This paper presents an application of fractional order control of the speed of a switched reluctance motor for electric and hybrid vehicles. It also presents a comparative study between speed control with a fuzzy logic control (FLC) regulator and with a fractional order proportional integral controller. Numerical simulations show that speed control with the designed fractional order proportional integral controller achieves better dynamic behaviour of the motor, better speed tracking and good accommodation of load disturbances. The fractional order proportional integral (FOPI) controller achieved higher performance and a longer SRM lifetime than the fuzzy logic controller.
    Keywords: Fractional order Proportional integral controller (FOPI); Fuzzy Logic Control (FLC); speed control; Variable Reluctance Motor; Direct torque control (DTC).
    DOI: 10.1504/IJAIP.2018.10024488
  • Smart and Efficient IoT Based Quality Tracking System for Perishables Pertaining to Indian Conditions   Order a copy of this article
    by Shriram K V, Sriharsha P, Ikram Shah 
    Abstract: In most countries, vegetables, fruits and other eatables (including raw materials) are cultivated in one place and circulated to all other places in the country through various modes of transport. For example, most vegetables for Coimbatore and other parts of Tamil Nadu are cultivated in and circulated from Ooty by trucks, lorries and other modes of transport. Vegetables are thus transported from one place to another and then distributed onwards to more interior places. In the process, the order in which the vegetable baskets are stacked in determines the order in which they are stacked out: the first basket placed is the last one to go out, and by the time it goes out it could already be spoiled and unusable, while other vegetables could withstand much more time inside the transport vehicle. Our innovation is aimed at identifying the vegetable baskets most likely to spoil and delivering them in order of the chance that they may spoil earlier, thereby delivering all the vegetable baskets at appropriate times in an identified order without letting them go to waste. Our product is kept in the truck or lorry cabin: multiple sensors and a micro-controller track each basket/bag through a tag number, which keeps track of the basket/bag even if it is transferred to another truck. The overall cost of this product is very low, a one-time investment for owners, and it would ensure healthy vegetables are delivered to the customer.
    Keywords: Perishable tracking; IoT; Food quality monitoring; Food Products; Food wastage; Technology for quality monitoring; Food transport.

  • Enhanced Differential Evolution with Information Preserving Selection Strategy   Order a copy of this article
    by Pravesh Kumar 
    Abstract: In the present paper, two modifications of the Differential Evolution (DE) algorithm are proposed. The first modification is a new selection technique called the Information Preserving Strategy (IPS), which tries to preserve and utilise important information about the search domain; the corresponding DE variant is called IpDE. The second modification is a new mutation strategy, called the Enhanced DE algorithm (EDE). Furthermore, a new variant named IpEDE, combining EDE and IpDE, is also proposed. The performance of the proposed variants IpDE, EDE and IpEDE is validated on a set of test problems, including standard test problems and selected test problems from CEC-2008. The algorithms are compared with some prominent DE variants, and it is observed that the proposed modifications help improve the performance of DE in terms of convergence rate and solution quality.
    Keywords: Differential evolution; Information preserving selection; Mutation; Global Optimization.
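For readers unfamiliar with the baseline the paper modifies, canonical DE/rand/1/bin — mutation from three random peers, binomial crossover, greedy selection (the step the proposed IPS replaces) — can be sketched as follows on a toy sphere function. This is the textbook algorithm, not the authors' IpDE/EDE variants:

```python
import random

random.seed(2)

def de(fitness, dim, bounds=(-5, 5), np_=30, f=0.5, cr=0.9, gens=200):
    """Canonical DE/rand/1/bin with greedy one-to-one selection."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(np_)]
    fit = [fitness(x) for x in pop]
    for _ in range(gens):
        for i in range(np_):
            # mutation: three distinct peers, none equal to i
            a, b, c = random.sample([j for j in range(np_) if j != i], 3)
            jr = random.randrange(dim)  # guaranteed crossover position
            trial = [pop[a][d] + f * (pop[b][d] - pop[c][d])
                     if (random.random() < cr or d == jr) else pop[i][d]
                     for d in range(dim)]
            ft = fitness(trial)
            if ft <= fit[i]:  # greedy selection: discard the loser entirely
                pop[i], fit[i] = trial, ft
    best = min(range(np_), key=lambda i: fit[i])
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)
sol, val = de(sphere, dim=5)
print(f"best sphere value: {val:.2e}")
```

Note how greedy selection throws the losing vector away; the paper's Information Preserving Strategy is motivated precisely by the search-domain information lost at that step.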

  • Improving the Flexible Neural Tree model with Swarm Intelligence   Order a copy of this article
    by Tomas Burianek, Sebastian Basterrech 
    Abstract: A type of feedforward neural network with a specific architecture was developed around ten years ago under the name of the Flexible Neural Tree (FNT). The model has two families of adjustable parameters: the parameters present in the activation functions of the neurons, and the topology of the tree. The method uses meta-heuristic algorithms for finding a good tree topology and the set of embedded parameters. The technique has been successfully applied to machine learning problems with time-series and sequential data. The canonical FNT was introduced with the radial basis function as the activation function of the neurons. In this article, we analyse the performance of the FNT when different types of activation functions are present in the tree. We present a comparative analysis among different types of neurons, studying the performance of the model with four types of neurons: Gaussian, hyperbolic tangent, the Fermi function and a linear variation of the Fermi function. The empirical analysis was carried out on a well-known simulated time-series benchmark and a real-world networking problem.
    Keywords: Feedforward Neural Network; Swarm Intelligence; Flexible Neural Tree; Time-series modeling; Forecasting.

  • Secure Data Transmission for Protecting the Users Privacy in Medical Internet of Things (M-IoT)   Order a copy of this article
    by Purushotham Jyotheeswari, N. Jeyanthi 
    Abstract: The Internet of Things (IoT) has been the catchphrase of recent years, with transdisciplinary research. The Medical Internet of Things (M-IoT) is a novel development in healthcare and information technology to store and retrieve medical data, which contains patients' sensitive information and heterogeneous medical data. To preserve users' privacy, we propose a secure data transmission mechanism for M-IoT. The proposed approach encompasses three phases: user authentication, symmetric key generation and disjoint multipath data transmission. In the first phase, gateways are validated with the cloud data servers. In the second phase, secret keys are generated for scrambling the message. Finally, the disjoint multipath data transmission divides the encrypted data into fragments and sends them to the server. The experimental evaluation demonstrates the efficiency of the proposed protocol in terms of reduced delay and response time when uploading and downloading medical data from the cloud servers.
    Keywords: Authentication; Wireless Medium; Medical Data; Privacy; Internet of Things.
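One simple way to realise the fragment-splitting step that disjoint multipath transmission relies on — so that no single intercepted path reveals the record — is n-of-n XOR splitting. This is only a hypothetical sketch of that idea (the paper's actual fragmentation scheme is not specified in the abstract):

```python
import os
from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def split_xor(data: bytes, n: int):
    """n-of-n XOR splitting: n - 1 random pads plus the XOR of the
    data with all pads. Each fragment alone reveals nothing about
    the message; all n fragments together recover it."""
    pads = [os.urandom(len(data)) for _ in range(n - 1)]
    return pads + [reduce(xor_bytes, pads, data)]

def join_xor(fragments):
    """XOR all fragments back together to recover the message."""
    return reduce(xor_bytes, fragments)

frags = split_xor(b"patient record #17", 3)  # one fragment per disjoint path
print(join_xor(frags))
```

Each fragment would travel over a different node-disjoint path; an eavesdropper on any one path sees only uniformly random bytes.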

  • Fuzzy Associated Trust Based Data Security in Cloud Computing by Mining User Behavior   Order a copy of this article
    Abstract: Cloud computing now plays an indispensable role in various areas. Although the cloud is flexible and cost-effective, it has several challenging issues to be addressed, chief among them cloud security and privacy. The proposed fuzzy-based security mechanism enhances the security level of data storage in the cloud by computing cloud users' trustworthiness from their behaviour. Reliability is assessed using parameters that express user behaviour, such as transfer rate, bandwidth and number of bytes per second of data from service provider to user, the time period of access to the cloud system, the timings of user visits, and the IP address used by the user for cloud access. Cloud information is protected by encryption using a key generated from the trust level of clients and their frequent access pattern. The frequent access pattern is detected by mining users' past behaviour with the FP-Growth algorithm. Experimental results show that the proposed scheme withstands blackhole attacks and offers a higher packet delivery ratio.
    Keywords: Cloud computing; security; privacy; trust; fuzzy analysis; pattern mining.
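The frequent-access-pattern step can be illustrated with a naive itemset counter over per-session access logs. FP-Growth, as used in the paper, builds a compressed prefix tree instead of enumerating combinations, but yields the same set of frequent patterns; the session data below is invented for the example:

```python
from itertools import combinations
from collections import Counter

def frequent_patterns(sessions, min_support=2, max_len=2):
    """Count every itemset up to max_len across sessions and keep
    those meeting min_support (naive stand-in for FP-Growth)."""
    counts = Counter()
    for s in sessions:
        items = sorted(set(s))
        for n in range(1, max_len + 1):
            for combo in combinations(items, n):
                counts[combo] += 1
    return {p: c for p, c in counts.items() if c >= min_support}

# hypothetical resources accessed by one cloud user, per session
sessions = [["login", "storage"], ["login", "storage", "vm"],
            ["login", "vm"], ["storage"]]
pats = frequent_patterns(sessions)
print(pats)
```

The frequent patterns (e.g. that "login" and "storage" co-occur in two sessions) then feed into the key-generation step alongside the fuzzy trust level.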

  • The Report of Questionnaire Survey on Privacy in Social Networking Websites   Order a copy of this article
    Abstract: Social networking sites (SNS) have become an important live information source. The large amount of personal information available on SNS attracts the attention of corporate, business and marketing people, who may misuse users' personal information in different ways. This leads to critical user concerns over privacy. This paper tries to identify the factors that influence privacy disclosure on SNS through a questionnaire. The main focus of this paper is to examine the behaviour and privacy issues of users on SNS. It also analyses the survey results, which illustrate interesting findings on the use of SNS and awareness of personal data protection.
    Keywords: Privacy; Social Networking Sites; questionnaire survey; new findings.

  • A Novel Ant Colony Optimization Approach for Fretting Out Wormhole Attack in Mobile Ad Hoc Networks   Order a copy of this article
    by Ashutosh Sharma, Lokesh Tharani 
    Abstract: Ad hoc networks are becoming more prevalent by the day owing to their viability and inexpensive infrastructure. With their exponentially increasing use, ad hoc networks are becoming vulnerable to various attacks. The wormhole attack is the most severe threat to ad hoc networks: it can lure and bypass a large volume of traffic, enabling an attacker to receive and wield network traffic. This kind of access to network resources lets the invader launch various other attacks. We propose a novel ant colony approach to detect the wormhole link by keeping a close watch on the pheromone value of each backward ant (BANT). The proposed technique not only detects the wormhole link but also improves network performance, demonstrating its edge over other techniques.
    Keywords: Mobile ad hoc networks (MANETs); AODV; Wormhole attack; Reverse Trip Time (RTT); Ant Colony Optimization; Network simulator; DelPHI.
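The pheromone bookkeeping that the detection scheme watches can be sketched with forward ants choosing next hops by pheromone weight and backward ants (BANTs) evaporating and reinforcing the links travelled. This is only a toy illustration of basic ant-colony routing state (topology and constants invented), not the paper's wormhole detector:

```python
import random

random.seed(3)

# toy topology: directed links with a pheromone value on each
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
pher = {(u, v): 1.0 for u in graph for v in graph[u]}
RHO, Q = 0.1, 1.0  # evaporation rate, deposit constant

def ant_walk(src, dst):
    """Forward ant: next hop chosen with probability proportional
    to the pheromone on the outgoing link."""
    path, node = [src], src
    while node != dst:
        nbrs = graph[node]
        node = random.choices(nbrs, [pher[(node, n)] for n in nbrs])[0]
        path.append(node)
    return path

def reinforce(path):
    """Backward ant: evaporate all links, then deposit an amount
    inversely proportional to path length on the links travelled."""
    for link in pher:
        pher[link] *= (1 - RHO)
    for u, v in zip(path, path[1:]):
        pher[(u, v)] += Q / (len(path) - 1)

for _ in range(100):
    reinforce(ant_walk("A", "D"))
print(sorted(pher.items(), key=lambda kv: -kv[1]))
```

A wormhole advertises an artificially short path, so its link would accumulate anomalously high pheromone; monitoring per-link pheromone for such outliers is the intuition behind the proposed detection.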

  • Optimal Web Page Classification Technique using Artificial Neural Network   Order a copy of this article
    by Anusha Mallikarjun Meti, Mallikarjun M Kodabagi 
    Abstract: The rapid growth of the World Wide Web (WWW) demands automated assistance for web page classification and categorisation. Web page classification is a supervised learning problem and a hard topic in data mining and machine learning. Web pages contain unstructured data, and mining content from web documents and classifying them is a challenging problem. In this work, a method for web page classification is proposed. The method comprises three phases: feature extraction, information learning and classification. In the feature extraction phase, we extract object-based features and utilise them to extract the informative contents of the web pages. The information learning phase uses a decision tree algorithm (ID3, i.e. Iterative Dichotomiser 3) to extract rules from the computed features. Based on the extracted rules, the classification phase utilises a hybrid classifier known as the Artificial Neural Network and Group Search Algorithm with Firefly (ANN-GSOFF) algorithm to improve web page classification. The performance of the proposed technique has been evaluated on the WebKb dataset using parameters such as sensitivity, specificity and accuracy. The proposed ANN-GSOFF achieves an overall sensitivity of 83.12%, an overall specificity of 67.42% and an overall accuracy of 74.77%.
    Keywords: Artificial neural network; Classification; Feature extraction; Information Learning; Optimization; Webpage Classification.

  • Feature Extraction and Analysis of Overt and Covert EEG Signals with Speller Devices   Order a copy of this article
    by Mridu Sahu, Saumya Vishwal, Sneha Shukla 
    Abstract: Brain-computer interfaces (BCIs) are used by motor neuron disease patients for communication; with a BCI, the performance of rehabilitative techniques for such patients improves greatly. The brain's electroencephalogram (EEG) signals, detected and recorded by BCI devices, are used for analysis. Using the P300 component of the ERP to detect attention towards a character is a popular approach. The first successful speller device, proposed by Farwell and Donchin, was the P300 Speller. The device consists of a 6x6 matrix of alphanumerics whose rows and columns are randomly intensified in order to generate a stimulus in the user's brain. This stimulus, elicited as a consequence of the rare intensification of the intended character, aids patients in communicating through an external device. This method, however, works well only under overt attention, and the low signal-to-noise ratio of EEG signals hinders the efficiency of communication. Hence, for efficient use of BCIs even under covert attention, a new method was devised: the Geometric Speller, used for patients with locked-in syndrome. In our study, the two devices are compared on the basis of user experience as well as efficiency and accuracy of signal detection. A dataset containing both GeoSpeller and P300 Speller readings is selected and segmented. Statistical features are extracted from the data and used for analysis. A comparative study of the GeoSpeller and the P300 Speller under covert and overt attention is presented to analyse both methods and their usability.
    Keywords: Geometric Speller; P300 Speller; overt and covert attention; Brain Computer Interfaces; Electroencephalography; Motor Neuron Disease.

  • A System to prevent toiletry (lavatory) based diseases such as Norovirus, Staphylococcus, Escherichia and Streptococcus through IoT and Embedded Systems   Order a copy of this article
    by K.V. Shriram, Giridhararajan R, Ikram Shah, Karthikeyan S, Abhishek SN 
    Abstract: One would be surprised to know the major role that toiletry-based diseases play in world hygiene. Many times they have even been fatal, killing many people. On one side there are no toilets and people still use open spaces; on the other, there are toilets, but they lack maintenance. Both are dangerous. As everyone knows, lavatories are everywhere, not only at home: from the simplest bus stands in interior villages to the most sophisticated airports, toilets play a major role, and lavatories inside trains, flights and other modes of transport are also on the list. All of these lavatories, if not maintained well, lead to many disastrous and dangerous diseases caused by Staphylococcus, Escherichia and Streptococcus. Surveys reveal that unclean lavatories mostly affect children, with the victims mostly under five. Hence, with today's abundant technological growth, it is important to provide a technical solution to monitor the quality of lavatories, so that they are maintained as and when required. Here, we aim at a frugal IoT-based solution for monitoring the quality of lavatories on the go, alerting the people concerned to carry out the necessary action. We also provide a simple feedback mechanism that can be used to alert the concerned staff about the quality of the lavatories. We capitalised on IoT, data analytics and embedded systems to build this system. The system has been tested, and we observe that, if implemented, it would be a definite value-add for users and would help in building a healthy chain of lavatories.
    Keywords: cleanliness; unclean lavatories; toiletry diseases; IoT; Cloud; Data Analytics; On the go; feedback; hygiene.

  • An Ant Colony Algorithm of Recurrent Target Assignment For a Group of Control Objects   Order a copy of this article
    by Viacheslav Abrosimov 
    Abstract: In practice, a common problem is to perform periodic monitoring of a territory under uncertain conditions when the current situation evolves unpredictably. In this case, at each cycle the routes of control objects must take into account the situation at the previous cycle. We consider a target assignment approach for a group of control objects performing such monitoring. For each object in the group, the routing problem is solved using an ant colony algorithm that involves in explicit form the situation parameters defined at the previous cycle of monitoring.
    Keywords: vehicle routing problem; ant colony algorithm; control object; recurrence; situation intensity.

  • ADaas: A Secure Attribute Based Group Signature based Agri-Cloud Framework   Order a copy of this article
    by E. Poornima, N. Kasiviswanath, C. Shoba Bindu 
    Abstract: Cloud computing is very helpful for farm management: it helps to increase farmers' productivity and also protects their products. It is an emerging way of computing in which applications, data and resources are provided as a service to the user over the web. Practical challenges faced in communicating with farmers include poor knowledge of weather forecasts, deficient production information, and a lack of information about sales and distribution of products. In this paper, an Agri-Cloud framework is developed to improve the cloud computing framework of Agriculture Data as a Service (ADaaS) over the public cloud. This model provides accurate information to the various stakeholders. To improve the security of the Agri-Cloud framework, an Attribute Based Group Signature (ABGS) system is used, which provides secure data sharing in the cloud data centre. The experimental analysis demonstrates that the proposed Agri-Cloud framework is better than existing models such as the modified water cloud model, the Agro cloud model and the AgroMobile model.
    Keywords: Agriculture Data as a Service; AgroCloud; Agri-cloud Model; Cloud Infrastructure; Key Generation; Elliptic Curve Digital Signature; Signature generation.
    DOI: 10.1504/IJAIP.2021.10020105
  • Exploring real domain problems on the second generation neural network   Order a copy of this article
    by Amit Gupta, Bipin Kumar Tripathi, Vivek Srivastava 
    Abstract: This paper presents the competitive performance of a second-generation neural network (complex-valued neural network, CVNN) operating in two-dimensional space over a first-generation neural network (real-valued neural network, RVNN) operating in single-dimensional space. Real datasets are selected for the proposed research work. The second-generation neural network is based on the theory of complex numbers: the real numbers form a subset of the complex numbers, which have magnitude and phase with which to represent a real-valued phenomenon. For the testing and training of real-valued problems in the complex domain, a mathematical approach, the Hilbert transformation, is used to convert all the real-valued data into complex form by shifting the phase by 90 degrees.
    Keywords: real value neural network; complex value neural network; complex activation function; back propagation algorithm; hilbert transformation.
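The Hilbert-transform conversion step the abstract describes — turning a real-valued signal into a complex one whose imaginary part is the input shifted by 90 degrees — can be sketched with a plain DFT (a rough illustration on a toy cosine; the paper's datasets and network are not reproduced):

```python
import cmath, math

def analytic_signal(x):
    """Analytic signal via the DFT: zero the negative frequencies
    and double the positive ones; the imaginary part of the result
    is the Hilbert transform (a 90-degree phase-shifted copy)."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if k == 0 or (n % 2 == 0 and k == n // 2):
            pass                # DC and Nyquist bins unchanged
        elif k < n / 2:
            X[k] *= 2           # positive frequencies doubled
        else:
            X[k] = 0            # negative frequencies removed
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]

n = 64
signal = [math.cos(2 * math.pi * 4 * t / n) for t in range(n)]
z = analytic_signal(signal)
# real part stays the cosine; imaginary part is the matching sine
```

Each real sample thus becomes a complex number carrying both magnitude and phase, which is the form the CVNN consumes.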

  • A node combination approach with fuzziness in shortest path problem   Order a copy of this article
    by Pushpi Rani, Dilip K. Shaw, Jayakrushna Sahoo 
    Abstract: The shortest path problem is one of the most popular and frequently used network optimisation problems. In this paper, a fuzzy node combination method is proposed to find the shortest path in an uncertain environment. The proposed method incorporates fuzziness into the node combination algorithm, an alternative to Dijkstra's algorithm. An illustration of the proposed fuzzy node combination method is presented, and the impact of the method is evaluated on a transportation network. Experimental results reveal that the fuzzy node combination algorithm is more efficient than existing fuzzy shortest path methods.
    Keywords: Fuzzy sets; fuzzy number; node combination; canonical representation; graded mean integration.
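The keywords point to graded mean integration as the defuzzification step: a triangular fuzzy travel time (a, b, c) is reduced to the crisp value (a + 4b + c) / 6 before path costs are compared. A hypothetical sketch, using an ordinary Dijkstra search once the weights are defuzzified (the paper's node-combination procedure would replace Dijkstra's relaxation, but the defuzzification is the same; the network below is invented):

```python
import heapq

def graded_mean(tfn):
    """Graded mean integration of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + 4 * b + c) / 6

def fuzzy_shortest(graph, src, dst):
    """Dijkstra over graded-mean-defuzzified edge weights."""
    dist, pq, done = {src: 0.0}, [(0.0, src)], set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        if u == dst:
            return d
        for v, tfn in graph.get(u, []):
            nd = d + graded_mean(tfn)
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# travel times as (optimistic, most likely, pessimistic) fuzzy numbers
g = {"s": [("a", (3, 4, 5)), ("b", (1, 1, 1))],
     "b": [("a", (1, 2, 3))],
     "a": [("t", (1, 1, 1))]}
print(fuzzy_shortest(g, "s", "t"))
```

Here the s-b-a-t route wins (defuzzified cost 1 + 2 + 1 = 4) over the direct s-a-t route (4 + 1 = 5).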

  • Map Reduce Approach For Road Accident Data Analysis Using Data Mining Techniques   Order a copy of this article
    by Nagendrababu Cs 
    Abstract: Nowadays, one of the most life-threatening risks to humans is road accidents. Traffic accidents causing severe damage occur everywhere. The best answer to these kinds of accidents is to predict future accidents in advance, giving drivers the chance to avoid the dangers or reduce the damage by reacting quickly. Predicting accidents on the road can be accomplished using classification analysis, a data mining technique requiring enough data to build a learning model. However, building such a prediction system involves several issues: it requires considerable hardware resources to collect and analyse traffic data for predicting road accidents, since the data are extremely large. The purpose of this manuscript is to build a prediction framework that resolves all of these issues. This paper recommends using the map-reduce framework to process and analyse big traffic data efficiently. On this basis, the prediction system first pre-processes the big traffic data and analyses it to create data for the learning system. To improve prediction accuracy, the corrected data are classified into several groups, to which classification analysis is applied.
    Keywords: Road accident prediction; MapReduce; clustering; pre-processing; association rules; data set.
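The map/shuffle/reduce pipeline the abstract relies on can be sketched in miniature in plain Python (no Hadoop); the record layout and all names below are hypothetical, not from the paper.

```python
from collections import defaultdict

# toy accident records: (road_type, weather, severity)
records = [
    ("highway", "rain", "fatal"),
    ("highway", "clear", "minor"),
    ("urban", "rain", "serious"),
    ("highway", "rain", "fatal"),
]

def map_phase(record):
    """Emit one ((road, weather, severity), 1) pair per record."""
    road, weather, severity = record
    yield (road, weather, severity), 1

def shuffle(pairs):
    """Group the emitted values by key, as the framework would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Sum the counts for each (conditions, severity) key."""
    return {key: sum(values) for key, values in groups.items()}

pairs = [p for r in records for p in map_phase(r)]
counts = reduce_phase(shuffle(pairs))
print(counts[("highway", "rain", "fatal")])   # → 2
```

In a real MapReduce job the map and reduce functions run distributed over many machines; the logic per key is the same as in this single-process sketch.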

  • Shunt Active Power Filter with a Three-Phase Neutral-Point-Clamped Inverter for Power Quality Improvement Based on Fractional Order Proportional Integral Control   Order a copy of this article
    by Hayette Dendani, Ahmad Taher Azar, Amar Omeiri, Mohamed Adjabi, Sihem Ghoudelbourk, Djalel Dib 
    Abstract: The quality of the electrical wave is far from perfect, owing to the use of non-linear loads that generate current harmonics and consume reactive power, and it can be degraded by several types of disturbance. Given the origins and effects of harmonic pollution on electrical networks, this study proposes a shunt active filter built around a three-level inverter with hysteresis-based control, which injects into the network a current equal to that absorbed by the polluting load but in phase opposition, so that the supply current becomes sinusoidal. The regulation and stability of the filter's power supply during load variations are ensured first by a classical PI controller and then by a fractional-order PIα controller. A comparative study has been conducted and the results validated in the MATLAB/Simulink environment.
    Keywords: Active filters; shunt active power filter; Reactive Power; Harmonics; fractional order PI controller (FOPI).
    DOI: 10.1504/IJAIP.2018.10038185
  • Cooperative Retransmission Based MAC Method for Underwater Sensor Networks   Order a copy of this article
    by H. Abdul Gaffar, P. Venkata Krishna 
    Abstract: Designing a medium access control (MAC) protocol is a challenging task for underwater sensor networks (USN). In this paper, a cooperative retransmission based medium access control (CRMAC) method for underwater sensor networks is proposed. When direct transmission between the source node and the receiving node fails, retransmission is initiated through cooperative nodes. In the CRMAC method, the cooperative nodes are selected using a virtual backoff algorithm. The CRMAC method is compared with Slotted FAMA. The simulation results show that the proposed method outperforms Slotted FAMA in terms of throughput and packet delivery ratio.
    Keywords: Underwater sensor network; cooperative retransmission; medium access control; throughput.
    DOI: 10.1504/IJAIP.2018.10023068
  • Facial Expression Recognition of Multiple Stylized Characters using Deep Convolutional Neural Network   Order a copy of this article
    by Yogesh Kumar, Shashi Kant Verma, Sandeep Sharma 
    Abstract: Human faces manifest a treasury of abilities, including emotions, character, state of mind and many more. Apart from what is spoken, human faces convey plenty of information in the form of facial expressions. Recognition of facial expressions has become significant in the discipline of human-computer interaction for attaining the emotional state of human beings. This paper proposes a Facial Expression Identification Method (FEIM) for the recognition of six basic facial expressions (anger, sad, fear, happy, surprise and disgust) plus one neutral emotion. The features are extracted by an integrated Gabor and Local Binary Pattern (LBP) feature extraction method, and Principal Component Analysis (PCA) is applied for feature selection. A deep neural network is trained on the FERG-DB (Facial Expression Research Group Database) dataset to classify the facial expression images into seven emotion expression classes (anger, fear, disgust, happy, neutral, sad, and surprise). The effectiveness of the proposed system is demonstrated by comparing its recognition rate with state-of-the-art techniques. The overall results in terms of precision, recall and f-score also favour the efficacy of the proposed method.
    Keywords: Facial Expressions; Deep Learning; Convolutional Neural Network; Deep Neural Network; Facial Features; Gabor Filter; Principal Component Analysis; Local Binary Pattern.

  • Some results on edge irregular product vague graphs   Order a copy of this article
    by Abolfazl Lakdashti, Hossein Rashmanlou, P.K. Kishore Kumar, Ganesh Ghorai, Madhumangal Pal 
    Abstract: Vague graphs are a rapidly growing research area, as they generalise fuzzy graphs. In this paper, we analyse the concept of edge regular product vague graphs and its properties. The concepts of edge irregular and strongly edge irregular product vague graphs are also analysed, along with their properties.
    Keywords: Product vague graph; edge regular and irregular product vague graph; strongly edge irregular.
    DOI: 10.1504/IJAIP.2018.10024076
  • Satellite Image Matching and Registration using Affine Transformation and Hybrid Feature Descriptors   Order a copy of this article
    by N.S. Anil, D.N. Chandrappa 
    Abstract: Image registration (IR) is a primary image processing technique for determining the geometrical transformation that gives the most accurate match between a reference and a floating image. IR aligns multiple images for matching, finds the differences among them and extracts the essential information they share. Several existing methods aim to reduce the subjectivity of manual inter- and intra-operator processing and to shorten this time-consuming task. In existing work, the inlier and outlier ratios are equal, which reduces image registration accuracy. To suppress outliers and increase the inlier ratio, a Hybrid Invariant Local Features descriptor is proposed, consisting of Binary Robust Invariant Scalable Keypoints (BRISK), Speeded Up Robust Features (SURF), and Features from Accelerated Segment Test (FAST). The hybrid feature descriptor extracts the relevant features from the image; the feature matching step then finds the correct correspondences between the two sets of features, after which an affine transformation removes false feature matches. An experimental analysis compares the inlier ratio and repeatability of the individual, combined and proposed feature descriptors. The proposed hybrid feature descriptor achieves an inlier ratio of 1.913 and a repeatability of 0.121, showing better results than the existing methods.
    Keywords: Affine Transformation; Binary Robust Invariant Scalable Key point; Feature from Accelerated Segment Test feature; Histogram Equalization; Image Registration; Line Blending; Speed Up Robust Feature.
    DOI: 10.1504/IJAIP.2021.10035732
  • Node Replication Attacks in Mobile Wireless Sensor Networks   Order a copy of this article
    by Mojgan Rayenizadeh, Marjan Kuchaki Rafsanjani 
    Abstract: Mobile wireless sensor networks (MWSNs) commonly operate in hostile environments such as battlefields and surveillance zones. Owing to their operating nature, MWSNs are often unattended and are generally not equipped with tamper-resistant tools. Because of node mobility, an MWSN is more vulnerable than a static WSN (wireless sensor network). Node replication attacks are among the most insidious attacks in sensor networks and can be the foundation of many other attacks. Many methods have been proposed to detect and prevent this attack; however, almost all practical schemes assume a stationary network model in which sensor nodes are fixed and immobile, so they are not suitable for MWSNs. In this article, we consider several node replication attack detection methods for mobile wireless sensor networks, which use different detection strategies, and we compare them according to parameters such as memory overhead and communication overhead.
    Keywords: Mobile Wireless Sensor Network; Node replication attacks; Wireless network security; Detection method.

  • Adapted Bucolic and Farming Region Pattern Classification Using Artificial Neural Networks for Remote Sensing Images   Order a copy of this article
    by P.S.Jagadeesh Kumar 
    Abstract: This paper describes the use of multi-layer perceptron neural networks for classifying rural (bucolic) and farming regions in remotely sensed images. Spectral remote sensing images were labelled with a rural and agricultural taxonomy. Cumulative histograms, Voronoi tessellation and spatial pixel matrices extracted from a geographical information system were used as training inputs to the multi-layer perceptron neural networks. The derivation of image texture features using Voronoi tessellation proved to be central to the aerial image pattern classification of rural and farming regions.
    Keywords: Agricultural Region; Cumulative Histogram; Geographical Information System; Multi-layer Perceptron Based Neural Networks; Pattern Classification; Remote Sensing Image; Voronoi Tessellation.

  • Damping of oscillations in multi machine power system by PSO-GWO optimized dual UPFC based controller   Order a copy of this article
    by Narayan Nahak, Ranjan Kumar Mallick 
    Abstract: In this work, a hybrid Particle Swarm Optimization-Grey Wolf Optimizer (PSO-GWO) technique is proposed to tune the parameters of a UPFC-based dual damping controller. The proposed optimized controller is applied to damp inter-area oscillations in a multi-machine power system. The dual controller simultaneously handles two independent control actions of the UPFC: the modulation index of the series converter and the phase angle of the shunt converter. Before being applied to the multi-machine system, the controller is tested on a single-machine system to validate its efficacy. The results obtained with the proposed controller are compared with PSO- and GWO-optimized controllers to establish its superiority. A broad comparison is performed between single lead-lag and dual controllers optimized by the PSO, GWO and PSO-GWO techniques. The system responses and eigenvalues show that the proposed PSO-GWO optimized dual controller damps oscillations to a much larger extent than all the other single and dual optimized controllers.
    Keywords: FACTS; UPFC; PSO-GWO; dual damping controller; multi machine stability.
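A minimal sketch of the hybrid PSO-GWO idea the abstract names, on a toy objective rather than the controller-tuning problem: the three best wolves (alpha, beta, delta) from GWO steer a PSO-style velocity update. All parameter values and names here are our own illustrative assumptions, not the paper's.

```python
import random

def sphere(x):
    """Toy objective standing in for the controller-tuning cost."""
    return sum(xi * xi for xi in x)

def pso_gwo(obj, dim=2, agents=20, iters=100, seed=1):
    """Hybrid PSO-GWO sketch: the three best wolves (alpha, beta, delta)
    pull every agent through a PSO velocity update with decaying inertia."""
    rnd = random.Random(seed)
    pos = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(agents)]
    vel = [[0.0] * dim for _ in range(agents)]
    best = min(pos, key=obj)[:]                 # best position found so far
    for t in range(iters):
        alpha, beta, delta = [p[:] for p in sorted(pos, key=obj)[:3]]
        if obj(alpha) < obj(best):
            best = alpha[:]
        w = 0.9 - 0.5 * t / iters               # inertia decays over time
        for i in range(agents):
            for d in range(dim):
                c1, c2, c3 = rnd.random(), rnd.random(), rnd.random()
                pull = (c1 * (alpha[d] - pos[i][d]) +
                        c2 * (beta[d] - pos[i][d]) +
                        c3 * (delta[d] - pos[i][d])) / 3.0
                vel[i][d] = w * vel[i][d] + pull
                pos[i][d] += vel[i][d]
    return best

best = pso_gwo(sphere)
print(sphere(best))   # the swarm contracts toward the optimum at the origin
```

In the paper's setting the objective would instead be a damping-performance index evaluated from the power-system model.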

  • An IoT based accident severity detection for automobiles with alerting the appropriate location of the accident - An innovative attempt   Order a copy of this article
    by Juluru Anudeep, G. Kowshik, G.I. Aswath, Shriram KV 
    Abstract: Although occupant protection systems aid present means of transport such as cars, crash severity statistics from the past few years indicate that the mortality rate has risen to about 35% [1], showing the need to augment the quality of service offered to citizens. An article in the Times of India reports that 27% of accident deaths in India are due to lack of medical attention [2]; delay in medical help has been a persistent cause of death, mainly for accidents occurring on highways. The NHTSA (National Highway Traffic Safety Administration) reports that, on average, about 15,913 [3] accidents occur per day in the USA, based on a five-year survey. For instance, consider a stretch of a long highway served by a single hospital with basic facilities (an emergency ward, an ambulance service and an operating theatre), and suppose four or five accidents have happened in the hospital's vicinity. The hospital authorities then face a dilemma over where to send the ambulance first: if it is sent to the nearest site, where the severity of the accident is low, a severely injured person elsewhere may succumb. The problem lies in conveying the severity of each accident to the hospital. Systems exist that detect a collision and deploy airbags and other safety measures, but many do not measure the severity level of the accident. We therefore introduce a system that measures and communicates accident severity, together with a geotag (the location where the accident happened) and a timestamp. The system is built with force sensitive resistors (FSRs), which can detect an impact accurately, and a GPS module that provides the longitude and latitude of the crash site.
The sensor data are processed by a Python script that checks whether a crash has occurred. The system mainly strives to decrease the delay in the arrival of medical services at the place of the accident, and it also helps hospitals decide where to send medical services first when multiple accidents occur at the same time, using the severity level.
    Keywords: Severity level; GPS data; force sensitive resistor; collision; webpage.

  • An Exploratory Data Analysis on Rating Data using Recommender System Algorithms   Order a copy of this article
    by Lakshmi Pathi 
    Abstract: The day-to-day uploading of data to the World Wide Web and the growth of e-commerce have driven the development of recommender systems. A recommender system filters information based on the user's interests; recommender systems are now used in almost every domain, and their chief advantage is making search easy. Recommender systems are classified into content-based filtering, collaborative filtering and hybrid approaches. In this paper, we analyse the performance of the item similarity, matrix factorization and popularity recommender algorithms and evaluate them with precision-recall and root mean square error metrics.
    Keywords: Recommender Systems; Collaborative filtering; Matrix factorization; evaluation metrics.

  • Towards A Standard-Based Model of System Dependability Requirements   Order a copy of this article
    by Ghadeer Al-Qahmouss, Khaled Almakadmeh 
    Abstract: System dependability is a quality factor indicating that a system is able to provide trusted services and that system failures will not cause catastrophic or unexpected events. The identification of dependability requirements helps to develop dependable real-time critical systems that earn users' trust. The literature review shows that there is no standard-based model that captures dependability requirements for all types of real-time critical systems. This paper presents a standard-based model for capturing the dependability requirements of real-time critical systems, identifying them using concepts from the ISO 25010 and ISO 19761 international standards. Further, this paper presents an approach that demonstrates the practical steps needed to build a dependability requirements model. An experiment conducted on the requirements specification of an aircraft control system is presented to verify the applicability of the proposed standard-based model for modelling the dependability requirements of such a real-time system.
    Keywords: Dependability; Real-Time Systems; System Requirements; ISO19761; ISO/IEC 25010.

  • A Review on various Energy Sources and Harvesting Techniques in WSN   Order a copy of this article
    by Immanuvel Arokia James K, Prabakaran R, Kanimozhi R 
    Abstract: This paper surveys the various energy sources and harvesting methods applicable in the wireless sensor network (WSN) field. It is strongly believed that the performance of a WSN can be improved by selecting proper energy sources; similarly, by implementing good harvesting methods and better routing algorithms, the battery (power supply) lifetime can be matched to the sensor node lifetime, which helps to increase the lifetime of the WSN. An efficient energy harvesting technique can remove the need for frequent replacement of energy sources and hence offers a near-perpetual network operating environment. This paper intends to help researchers design better energy harvesting (EH) techniques for WSN-based applications using ambient energy sources, with the energy consumption of the sensor nodes as the main consideration for further research. We examine the different sources, the fundamental measurements needed to obtain adequate energy, and various recently proposed energy prediction models that can potentially increase energy harvesting in WSNs. We present a complete study of the different energy harvesting strategies, compare their performance, and anticipate some future directions for this energy harvesting research. Finally, some of the challenges that still need to be addressed to develop efficient and reliable energy harvesting systems for WSN environments are presented.
    Keywords: WSN Applications; Energy Harvesting; Performance enhancement.

  • Optimal Path Planning with Hybrid Firefly Algorithm and Cuckoo Search Optimization   Order a copy of this article
    by Monica Sood, Vinod Kumar Panchal 
    Abstract: Background/Objective: Path planning is one of the core and most extensively studied problems in robotics. Its scope is not limited to robotics; it is also pertinent to many application areas, including simulations and gaming, computer graphics, very large scale integration (VLSI) and many more. This paper proposes an optimization algorithm to identify the optimum path from a defined source to a destination without any obstacle collision. Method: A hybrid algorithm is proposed by combining the properties of two swarm intelligence techniques: cuckoo search and the firefly algorithm. The multi-agent firefly algorithm uses the Lévy flight property for the random movement of fireflies and puts forth the best path from the defined source to the destination without colliding with any obstacle. The cuckoo's brood-parasitic behaviour of imitating the pattern of the host's eggs is used by the fireflies to handle obstacles present in the path. Result/Conclusion: The experimental results are in adequately acceptable agreement with the proposed hybrid algorithm. Three experiments are performed on the red-band satellite image of the urban and vegetation areas of the Alwar region in Rajasthan, India. The results indicate the efficiency of the proposed hybrid algorithm compared with the individual cuckoo search and firefly algorithms: the hybrid algorithm detected the optimum path at iteration 27 with a path length of 246 pixels and a simulation time between 112 and 167 seconds, whereas cuckoo search achieved the optimum path at iteration 49 with a simulation time between 179 and 230 seconds, and the firefly algorithm achieved the optimum path length at iteration 56 with a simulation time between 151 and 195 seconds.
    Keywords: Optimal Path Planning; Cuckoo Search; Firefly Algorithm; Nature Inspired Computing; Computational Intelligence; Machine Learning.

  • Advanced cryptography technique in certificateless environment using SDBAES   Order a copy of this article
    by C.G. Naveen Kumar, C. Chandrasekar 
    Abstract: Certificateless encryption is a kind of public key encryption designed to eliminate the disadvantages of the traditional PKI-based public key encryption scheme and the identity-based encryption scheme. The existing certificateless environment lacks security, which leads to numerous security issues, and it is also inefficient in terms of time and performance. To reduce these issues, the proposed work uses the SDBAES algorithm, which mitigates most of the security threats in the cloud and increases the efficiency of the system by reducing time and improving performance. The experimental results show that the proposed work provides higher security and efficiency than the existing technique.
    DOI: 10.1504/IJAIP.2018.10021467
  • Diminishing the selfish nodes by reputation and pricing system through SRA scenario   Order a copy of this article
    by John Paul Antony T, Victor S P 
    Abstract: In a MANET, every node relies on other nodes to forward data to its intended destination. However, a few nodes are not prepared to share their resources because of selfish behaviour. A reputation and pricing system provides a solution to this problem. We propose a Price and Reputation System (P&RS) scenario that helps diminish selfish nodes effectively by combining the procedures of both the reputation and pricing systems and by building a disseminated table (DAT) to globally collect reputation information. The Stratified Report Assisted (SRA) framework overcomes the deficiencies of the existing frameworks by effectively combining the procedures of both the reputation and price systems.
    Keywords: Selfish node; disseminated table; watchdog; reputation.

  • Supervised Microarray Gene Retrieval System Based on KLFDA and ELM   Order a copy of this article
    by Thomas Scaria, T. Christopher 
    Abstract: Microarray gene data processing has gained considerable research interest these days. However, processing microarray gene data is highly challenging owing to its volume. Taking this challenge into account, this work proposes a supervised microarray gene retrieval system that relies on two phases, namely feature dimensionality minimization and classification. The objective of feature dimensionality minimization is to make the classification process easier by weeding out the unwanted data. The feature dimensionality of the datasets is minimized by KLFDA, and the processed dataset is passed to the classification phase, which is carried out by an ELM. The proposed approach is evaluated on three benchmark datasets: colon tumour, central nervous system and ALL-AML. The experimental results prove that the proposed combination of KLFDA and ELM works better on all three datasets in terms of accuracy, sensitivity and specificity rates.
    Keywords: microarray gene retrieval; classification; feature dimensionality minimization.
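The ELM classifier named in this abstract admits a very compact sketch: the input weights are random and fixed, and only the output weights are solved in closed form via a pseudoinverse. The toy XOR data and all names below are our own illustration, not the paper's microarray setting.

```python
import numpy as np

def elm_train(X, y, hidden=20, seed=0):
    """Extreme Learning Machine sketch: random input weights stay fixed;
    only the output weights beta are fitted, by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))   # random input weights
    b = rng.normal(size=hidden)                 # random biases
    H = np.tanh(X @ W + b)                      # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# toy binary problem: XOR, which no linear model can separate
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])
W, b, beta = elm_train(X, y)
pred = elm_predict(X, W, b, beta).round()
print(np.allclose(pred, y))   # → True
```

Because no iterative backpropagation is needed, training reduces to one pseudoinverse, which is what makes ELM attractive for high-dimensional microarray data after dimensionality reduction.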

    by Rani Bms 
    Abstract: In retinal biometrics, the recognition rate is affected by the vasculature complexity of retinal images: the vascular pattern becomes very intricate in affected retinal images because of pathological signs. This paper presents retina verification that includes an AWN classifier to detect the blood vessel structure in a pathological retina. A distinct retinal feature that remains constant under pathological changes is the bifurcation angle, and this paper demonstrates a method for its extraction. The bifurcation points are generated and the positions of the corresponding bifurcation indications are computed. A sparse matrix representation is used to store the retina template, optimizing memory, and the template is then compared.
    Keywords: Retinal biometrics;vascular;AWN classifier;bifurcation angle;retina template;sparse matrix.

  • Trust based Multi-level Secure Routing for Authentication and Authorization in WMN   Order a copy of this article
    by Parveen Kumar Sharma, Rajiv Mahajan, Surender  
    Abstract: Wireless mesh networking (WMN) is a developing technology that is receiving increased attention as a high-performance, low-cost and rapid-deployment solution for next generation wireless communication systems. The main issue in WMNs is securing the routing protocols: a few routers in WMNs exhibit malicious behaviour by snooping on the exchanged data. Moreover, users may have different authorization levels and should not access information for which their authorization level is too low. Although many papers on privacy-preserving and secure routing in WMNs are available, they cannot address these issues effectively. Hence we propose a Trust based Multi-level Secure Routing protocol for Authentication and Authorization (TMSR-AA) in WMNs. In this protocol, every node in the network maintains a trust table of its neighbouring nodes, and the path trust value (PTV) along a route is calculated from these trust tables. Apart from trust, the protocol uses a multi-level security (MLS) mechanism in which the information to be transmitted is categorised into different security levels (SLs): the packets transmitted in the network are assigned an SL based on the type of data, and the mesh routers are assigned an SL based on their estimated trust counter. During message transmission, a mesh router with a specific SL is allowed to send packets only at the same or a lower level. The performance of the TMSR-AA-WMN protocol is compared with the ESR protocol in terms of packet delivery ratio, packet drop, delay, computational overhead, communication overhead and detection accuracy.
    Keywords: Authentication; authorization; communication overhead; computational overhead; packet delivery ratio; packet drop; secure routing; security levels; wireless mesh networks.
    DOI: 10.1504/IJAIP.2021.10040343
  • On Interval Covering Salesman Problem   Order a copy of this article
    by Siba Prasada Tripathy, Amit Tulshyan, Samarjit Kar, Tandra Pal 
    Abstract: After a disaster, during humanitarian relief transportation or mass fatality management, the cost of a journey between two places may be uncertain owing to variation in the degree of devastation across the affected area. In such scenarios, a viable model that can handle this uncertainty is essential for managing the situation in a cost-effective and reliable manner. In this paper, we introduce the Interval Covering Salesman Problem (ICSP), in which the cost of an edge is represented by an interval number. The ICSP is a variant of the Covering Salesman Problem (CSP) that is helpful for many real-world problems in uncertain environments. We formulate a mathematical model for the ICSP with uncertain travel costs between nodes, propose a Metameric Genetic Algorithm (MGA) for it, and present its simulation results. For the implementation, we use some benchmark TSP instances with the costs changed to interval numbers.
    Keywords: Traveling Salesman problem; Covering Salesman Problem; Uncertainty; Interval Constraint; Metameric Genetic Algorithm; Global parent.
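As an illustrative aside (not from the paper): working with interval edge costs mainly requires interval addition plus a total order for comparing candidate tours. One common convention, assumed here, orders intervals by midpoint and breaks ties by preferring the narrower (less uncertain) interval.

```python
class Interval:
    """Closed interval [lo, hi] used as an uncertain edge cost."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # interval addition: endpoints add component-wise
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def midpoint(self):
        return (self.lo + self.hi) / 2.0

    def width(self):
        return self.hi - self.lo

    def __lt__(self, other):
        # order by expected cost; prefer less uncertainty on ties
        if self.midpoint() != other.midpoint():
            return self.midpoint() < other.midpoint()
        return self.width() < other.width()

tour_a = Interval(3, 5) + Interval(2, 6)    # total cost [5, 11]
tour_b = Interval(4, 6) + Interval(3, 5)    # total cost [7, 11]
print(tour_a < tour_b)   # → True: midpoint 8.0 beats 9.0
```

With such an order in place, a genetic algorithm can rank and select interval-valued tour costs exactly as it would crisp ones.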

  • OABC Scheduler: A Multi-Objective Load Balancing Based Task Scheduling in a Cloud Environment   Order a copy of this article
    by Shameer A.P, A.C. Subhajini 
    Abstract: The primary goal of scheduling is to allocate each task to the corresponding virtual machine in the cloud. Load balancing of virtual machines (VMs) is an imperative part of task scheduling in clouds: whenever certain VMs are overloaded and the remaining VMs are underloaded with tasks, the load must be adjusted to achieve ideal machine utilization. This paper proposes a multi-objective task scheduling algorithm using an oppositional artificial bee colony algorithm (OABC), which aims to achieve a well-balanced load across virtual machines while minimizing execution cost and completion time. The generated solution satisfies quality of service (QoS) requirements and enhances IaaS providers' credibility and economic benefit. The OABC algorithm is designed around an oppositional strategy, employed bees, onlooker bees, scout bees and a suitable fitness function for the corresponding task. The experimental results demonstrate that the proposed approach achieves better task scheduling results (minimum cost, time and energy) compared with other approaches.
    Keywords: Cloud computing; Virtual machine; Load balancing; oppositional artificial bee colony; Time; Cost; Task scheduling.
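The oppositional strategy the abstract mentions is simple to sketch: for each random candidate x in [lo, hi], its opposite lo + hi − x is also evaluated, and the fitter half of the combined set seeds the colony. The load-spread fitness and all names below are our own toy assumptions, not the paper's scheduler.

```python
import random

def opposition_init(pop_size, dim, lo, hi, fitness, seed=7):
    """Opposition-based initialization: for each random candidate x,
    also evaluate its opposite (lo + hi - x) and keep the fitter half."""
    rnd = random.Random(seed)
    candidates = []
    for _ in range(pop_size):
        x = [rnd.uniform(lo, hi) for _ in range(dim)]
        x_opp = [lo + hi - xi for xi in x]      # point-wise opposite
        candidates.extend([x, x_opp])
    candidates.sort(key=fitness)                # lower fitness = better
    return candidates[:pop_size]                # best half survives

# toy fitness: imbalance of normalised loads across 3 VMs
fitness = lambda loads: max(loads) - min(loads)
pop = opposition_init(pop_size=10, dim=3, lo=0.0, hi=1.0, fitness=fitness)
print(len(pop))   # → 10
```

In the full OABC algorithm this opposition step feeds the employed/onlooker/scout bee phases, which then refine the candidate task-to-VM assignments.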

  • Copy move image forgery detection using cuckoo search   Order a copy of this article
    by Tarun Kumar, Gourav Khurana 
    Abstract: Most people face the dilemma of accepting photographs as authentic or not, mainly in forensics, where images influence judgments. Research communities constantly provide methods to identify such forged images, with particular attention to cases where a region of an image is copied within the same image (copy move forgery, CMF). For detecting these forged images, the Scale Invariant Feature Transform (SIFT) has proved its worth and robustness under various geometrical transformations. However, the framework needs to be optimized over the numerous parameters involved, because a wrong selection of values leads to wrong identifications. To solve this problem, a novel method named Cuckoo Search based Copy Move Forgery Detection (CSCMFD) is proposed for optimizing the parameter values in the SIFT framework. CSCMFD attains better results by automatically determining the values of the different parameters, compared with state-of-the-art work. Experimentation is performed on the Christlein et al. database and the MICC-F220 dataset. The experimental results prove that CSCMFD can capture small forged areas as well as regions that are difficult to identify by other methods.
    Keywords: Cuckoo Search; Meta-heuristic Algorithm; Copy Move Forgery Detection; Region Duplication; Scale Invariant Features Transform.
    DOI: 10.1504/IJAIP.2018.10021468
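A minimal sketch of cuckoo search as a parameter optimizer, in the spirit of the abstract above: Lévy-flight steps explore, and a fraction pa of nests is abandoned and resampled each generation. The 1-D quadratic objective stands in for a forgery-detection error measure; every name and constant here is our own assumption, not the paper's.

```python
import math
import random

def levy_step(beta=1.5, rnd=random):
    """One Lévy-flight step via Mantegna's algorithm: heavy-tailed
    steps that mix small local moves with occasional long jumps."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = rnd.gauss(0, sigma)
    v = rnd.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(obj, lo, hi, nests=15, iters=200, pa=0.25, seed=3):
    """Minimise a 1-D objective, e.g. a SIFT matching-threshold score."""
    rnd = random.Random(seed)
    xs = [rnd.uniform(lo, hi) for _ in range(nests)]
    best = min(xs, key=obj)
    for _ in range(iters):
        for i in range(nests):
            step = 0.01 * levy_step(rnd=rnd) * (xs[i] - best)
            cand = min(max(xs[i] + step, lo), hi)
            if obj(cand) < obj(xs[i]):          # greedy replacement
                xs[i] = cand
            if rnd.random() < pa:               # abandon some nests
                xs[i] = rnd.uniform(lo, hi)
        best = min(xs + [best], key=obj)        # keep best-ever solution
    return best

# toy objective standing in for a forgery-detection error measure
best = cuckoo_search(lambda t: (t - 0.6) ** 2, 0.0, 1.0)
print(abs(best - 0.6) < 0.05)   # → True
```

In a CMF pipeline, obj would run the SIFT matching stage with the candidate parameter value and return a detection-error score on a validation set.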
  • An inventive and innovative approach to monitor warehouse with Drone and IoT   Order a copy of this article
    by Aswath G.I., Shriram Vasudevan, Sundaram RMD, Giri Dhararajan, Sowmiya Nagarajan 
    Abstract: In recent years, technology growth in the sector of USVs/UAVs has been enormous. Among UAVs, quadcopters play a major role, and the abundant availability of platforms and hardware/software offers more choices and enhanced performance; their use for the delivery of goods, pizzas, etc. is well known. Taking advantage of this growth and the available facilities, we have attempted to use quadcopters to enhance the efficiency of warehouse monitoring through a frugal and cost-effective approach, proposing the use of drones inside a warehouse for inventory monitoring. Through the literature survey, we understood that there are large losses due to inefficient monitoring techniques, which mostly involve human effort. While manual verification is both time-consuming and error-prone, we have used drones, data analytics, Android and IoT as the backbone to simplify the process. This approach is found to be affordable, accurate and viable. Our drone flies inside the warehouse, tracks the goods and components rack by rack, and alerts the store manager through both the web interface and the Android application that we have developed. In this way, every individual box in the warehouse can be tracked, eliminating the chance of it being lost or untracked.
    Keywords: Drones; Warehouse Inventory control; IOT controlled drone; RFID; NFC; Raspberry Pi; Android; and Intelligent Warehouse Monitoring.

  • Attribute Weight Gain Ratio (AWGR): New Distance Measure to select optimal features from multivalued attributes   Order a copy of this article
    by L.N.C. Prakash K., Kodali Anuradha 
    Abstract: Identifying the appropriate features or attributes remains the most prominent stage of any information retrieval and knowledge discovery process. It involves selecting specific features, and subsets of them, that hold the vital portion of the data. Despite the prominence of this stage, most feature selection techniques opt for mono-valued features; accordingly, they cannot be extended to multivalued attributes, which require capturing different features from the dataset in parallel. To enable optimal feature selection for multivalued attributes, this manuscript proposes a novel technique for computing the optimal combination of multivalued attribute entries with respect to clusters in unsupervised learning and classes in supervised learning. The proposal is a distance metric motivated by the traditional relevance-assessment metrics, information gain and gain ratio. To analyze the performance of the proposed technique, an SVM classifier is trained on optimal multivalued attribute features selected using the proposed distance metric and then used to perform classification. Also, to show the significance of the proposed distance metric for clustering, the k-means clustering method with Attribute Weight Gain Ratio is executed on benchmark datasets. Simulation results depict the superior performance of the model for feature selection over multivalued attributes.
    Keywords: Multiclass attributes; optimal feature; k-means clustering; transaction weight; mining techniques.
    DOI: 10.1504/IJAIP.2020.10021278
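The proposed AWGR measure builds on the classical information gain and gain ratio. As background only (this is the traditional mono-valued computation, not the authors' multivalued extension), a minimal sketch might look like:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(values, labels):
    """Classical gain ratio of a single mono-valued attribute against labels."""
    n = len(labels)
    base = entropy(labels)
    split_entropy, split_info = 0.0, 0.0
    for v in set(values):
        subset = [l for x, l in zip(values, labels) if x == v]
        p = len(subset) / n
        split_entropy += p * entropy(subset)   # expected entropy after split
        split_info -= p * math.log2(p)         # penalizes many-valued splits
    info_gain = base - split_entropy
    return info_gain / split_info if split_info > 0 else 0.0

# Toy example: an attribute that perfectly separates two classes.
print(gain_ratio(["a", "a", "b", "b"], [0, 0, 1, 1]))  # -> 1.0
```

For multivalued attributes the abstract's point is precisely that this per-value split no longer applies directly, since one record contributes several values in parallel.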
  • Collaborative Computing Methods With Enhanced Trust and Security Mechanisms   Order a copy of this article
    by Dileep Kumar Gopaluni, R. Praveen Sam 
    Abstract: Security and privacy issues have been researched in the context of a single organization exercising control over its users' access to resources. In such a computing environment, security policies are defined and managed statically within the boundary of an organization and are often centrally controlled. However, building large-scale Internet-based application systems presents new challenges. There is a need for a model, and a framework for modelling, specifying and enforcing the agreements established by collaborating organizations with regard to trust and security. Such a trust agreement is needed to establish the inter-organizational security policies that govern the communication, coordination, cooperation and resource sharing of the collaborating group of networks. In this paper, application-level, trust-based security technologies to support Internet-based collaborative systems are introduced. An efficient collaborative method is proposed which performs network creation and authorization of nodes in the network, and then maintains security and trust levels on the network so as to provide a secure path for data transmission among the trusted nodes of the network. In the proposed work, an Enhanced Key Management Scheme (EKMS) is introduced to enhance security in the network, and several constraints are proposed for identifying trusted nodes. The manuscript also concentrates on an Intrusion Detection System (IDS) for identifying any faults in the established network, for smooth and efficient collaborative computing. The proposed method uses the NS2 simulator for network creation and the MATLAB environment for analyzing the performance of the collaborative network.
    Keywords: security; trust; collaborative computing; certificate authority; network authentication; intrusion detection system.
    DOI: 10.1504/IJAIP.2021.10029394
  • Development of Deep Intelligent System in Complex Domain for Human Recognition   Order a copy of this article
    by Swati Srivastava, Bipin K. Tripathi 
    Abstract: This paper aims to develop a deep intelligent system that can perform human recognition through proficient and compact deep learning. The proposed Complex Deep Intelligent System (CDIS) incorporates multiple segments, which include image representation in a lower-dimensional feature space, a Fused Fuzzy Distribution (FFD) and a Complex Hybrid Neural Classifier (CHNC). One of the advantages of our CHNC is a reduction in computational complexity, because very few of the novel complex higher-order neurons are sufficient to recognize a human identity. Further, the proposed intelligent system uses the advantages of both supervised and unsupervised learning to enhance recognition rates. CDIS outperforms the best results reported in the literature on three benchmark biometric datasets (CASIA iris, Yale face and Indian face) with 99.8%, 100% and 98.0% recognition accuracies, respectively.
    Keywords: Fused fuzzy distribution (FFD); complex hybrid neural classifier (CHNC); biometric; deep architecture.

  • Big data secure storing in cloud and privacy preserving mechanism for outsourced cloud data   Order a copy of this article
    by Dr B. Renuka 
    Abstract: Big data is a buzzword of this decade and has received tremendous attention from researchers because of its characteristics and features. Big data also poses many challenges to the world, namely storage, processing and security. In any technology, security is the prime concern. In this manuscript, we map out the new complications of big data with regard to security and, furthermore, direct our attention toward effective and privacy-preserving computing in the big data era. Specifically, we first formalize the general architecture of big data analytics, identify the corresponding security requirements, and present an efficient and privacy-preserving scheme for big data stored in the cloud.
    Keywords: Privacy Preserving; Security; Big data; Cloud Computing; outsourced data.

  • A Novel Approach for Increased Transaction Security with Biometrics and One Time Password: A Complete Implementation   Order a copy of this article
    by Deveshwar H, Gowtham V, K.V. Shriram 
    Abstract: The advent of distributed and ubiquitous computing systems has resulted in an increase in digital financial transactions, consequently making security a primary concern. Here we address the problem by proposing the use of biometric sensors embedded in mobile systems to authenticate the user and generate a One Time Pin (OTP), as opposed to existing systems that rely on static, constant PINs. This reduces the risk of spoofing and makes the user impervious to attacks on Automatic Teller Machine (ATM) centres. We propose the use of a central server that keeps track of requests and processes them. This ensures a wider scope for randomization of the PINs, reducing predictability to almost zero.
    Keywords: Biometrics; Fingerprint; One Time Pin (OTP); Mobile devices; Transactions; Debit/Credit cards.
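The abstract does not specify the PIN-generation algorithm. One standard way a server could derive a non-static PIN after successful biometric authentication is HMAC-based OTP generation in the style of RFC 4226, sketched here purely as an illustration (the secret and counter scheme are assumptions):

```python
import hmac, hashlib, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226-style HMAC-based one-time PIN with dynamic truncation."""
    msg = struct.pack(">Q", counter)                       # 8-byte counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# A central server could derive a fresh PIN per authenticated request,
# e.g. keyed on a per-user secret and a server-side request counter.
pin = hotp(b"per-user-secret", counter=1)
print(pin)  # a six-digit PIN, different for every counter value
```

Because the PIN depends on a keyed hash of a moving counter, a captured PIN is useless for the next transaction, which matches the abstract's goal of eliminating static PINs.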

  • Performance Measures of Diseases Affected Iris Images using Sigmoidal Multilayer Feed Forward Neural Network   Order a copy of this article
    by S.G. Gino Sophia, V. Ceronmani Sharmila 
    Abstract: The iris is a scarce natural password, used for human identification with reliability and security. The iris can be affected by a number of diseases, which in turn affect the iris recognition process; we therefore study and analyze the types of diseases affecting eye images. Localization of the iris is performed using edge detection, with the parameters being the neighbours of a pixel and the structuring element of a morphological technique. The iris images are trained with neural networks, and the regression and performance graphs are analyzed. Various disease-affected iris images and normal images are compared using the periodogram power spectral density in MATLAB.
    Keywords: Enhancement; Histogram; Localization; Neighbors of a pixel; Neural Networks; Regression; Normalization.
    DOI: 10.1504/IJAIP.2021.10023728
  • Circular Local Search for Unconstrained Optimization Problems   Order a copy of this article
    by Mohammed A. El-Shorbagy, Aboul Ella Hassanien, Ahmad Taher Azar 
    Abstract: In this paper, a heuristic algorithm to solve unconstrained optimization problems (UOPs) in two dimensions is proposed. This algorithm, called circular local search (CLS), is an efficient local search. The algorithm starts with an arbitrarily chosen point in the search domain. Secondly, a radius of CLS is defined around the current search point, where any point in this region is feasible. Finally, by moving through an angle with a decaying step length, CLS can move from the current search point to obtain a new base point. The radius and the angle of CLS are modified during the search. CLS is evaluated on many benchmark problems taken from the literature. The numerical results obtained show the robustness and effectiveness of the proposed method.
    Keywords: Circular Local Search; Unconstrained Optimization; Global Optimization.
    DOI: 10.1504/IJAIP.2018.10038186
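A minimal sketch of the scheme as described (an arbitrary start point, probes on a circle around the current base point, radius adapted during the search) is given below. The fixed 36-angle fan and geometric radius decay are simplifications assumed for illustration, not the authors' exact update rules:

```python
import math

def circular_local_search(f, x, y, radius=1.0, n_angles=36,
                          decay=0.9, iters=100):
    """Circular local search for 2-D minimization: probe points on a
    circle around the base point, chain to any improving probe, and
    shrink the radius when no probe improves."""
    best = f(x, y)
    for _ in range(iters):
        improved = False
        for k in range(n_angles):
            theta = 2 * math.pi * k / n_angles
            nx = x + radius * math.cos(theta)
            ny = y + radius * math.sin(theta)
            if f(nx, ny) < best:
                x, y, best = nx, ny, f(nx, ny)
                improved = True
        if not improved:
            radius *= decay  # refine the search around the base point
    return x, y, best

# Example: minimize the sphere function f(x, y) = x^2 + y^2.
bx, by, val = circular_local_search(lambda a, b: a * a + b * b, 3.0, -2.0)
print(val < 1e-3)
```

The radius shrinks only when a full sweep of probes fails to improve, so the search first travels at coarse scale and then refines locally, mirroring the radius/angle modification described in the abstract.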
  • Usability Estimation of Component Based Software System using Adaptive Neuro Fuzzy Approach   Order a copy of this article
    by Jyoti Agarwal, Sanjay Kumar Dubey, Rajdev Tiwari 
    Abstract: Cost-effective development is the prime goal of software developers. To achieve this goal, Component Based Software Systems (CBSS) are nowadays developed. In CBSS, existing components are reused to develop a new software system, which increases the reusability of components. It also reduces the time and effort of software developers, which is cost-effective. The success or failure of a software system depends on its usability, and usability can increase market revenue. So, to increase the acceptance rate of CBSS among users, it is important to evaluate the usability of a CBSS before the software is released. In this paper, the usability of CBSS is evaluated on the basis of four input factors using two widely used soft computing techniques, i.e., fuzzy logic and adaptive neuro-fuzzy logic. Experimental results obtained from both techniques are compared, and it is observed that the ANFIS approach reduces the error rate and provides more accurate results. This research work will help software developers estimate the usability of CBSS in a more efficient manner.
    Keywords: Usability; Component; Software; Fuzzy; Adaptive Neuro Fuzzy; Membership Function.
    DOI: 10.1504/IJAIP.2018.10020763
    by Suresh Cse, M. Nirupama Bhat 
    Abstract: Early detection and recognition of kidney disease is a fundamental issue in helping to stop progression to kidney failure. Data mining and analytics techniques can be used to predict Chronic Kidney Disease (CKD) by utilizing historical patient data and diagnosis records. In this study, predictive analytics techniques such as Decision Trees, Logistic Regression, Naive Bayes and Artificial Neural Networks are used for predicting CKD. Pre-processing of the data is performed to impute any missing values and to identify the factors that should be considered in the prediction models. The different predictive models are assessed and compared on the basis of prediction accuracy. The study provides a decision support tool that can assist in the detection of CKD. With the promise of predictive analytics on big data and the use of machine learning algorithms, predicting the future is no longer a difficult task, especially for the health sector, which has seen great progress following the adoption of new computer technologies that have opened up diverse fields of research. Many efforts are being made to cope with the explosion of medical data on the one hand, and on the other to extract useful knowledge from it, predict diseases and anticipate cures. This has led researchers to apply technical advances such as big data analytics, predictive analytics, machine learning and learning algorithms in order to extract useful information and support decision making. In this paper, we also present a study on the growth of big data in the healthcare system.
    Keywords: predictive analytics; machine learning; big data analytics; kidney failure problems; learning algorithms; diagnostics; data analytics; data mining.

  • Medical image watermarking technique using IWT-BSVD   Order a copy of this article
    by Sivakannan Subramani, G. Keerthana, K. Gayathri, Jaishree Sundar 
    Abstract: Special care and concealment are required for medical images, since judgments are made on the information obtained through them. Transmission of medical images demands strong security and patent protection in telemedicine applications. A highly secure and robust watermarking technique is proposed for the transmission of medical images over the internet and mobile phones. The Region of Interest (ROI) and Non Region of Interest (RONI) of the medical image are separated, and only the RONI is used for embedding watermarks. The medical image watermarking technique presented here is based on the integer wavelet transform (IWT) and bidiagonal singular value decomposition (BSVD). The original image is decomposed by IWT, then the grey-scale watermark image is embedded in the bidiagonal singular values of the low-frequency sub-band of the host image. Experimental results on benchmark images clearly show that the proposed scheme is highly resistant to attacks and has good invisibility.
    Keywords: ROI; RONI; Integer Wavelet Transform (IWT); bidiagonal singular value decomposition (BSVD).
    DOI: 10.1504/IJAIP.2021.10023311
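The embed-in-singular-values step can be illustrated with a toy SVD-domain scheme. Note this sketch uses a plain SVD on a synthetic block with well-separated singular values, whereas the paper operates on the bidiagonal SVD of the IWT low-frequency sub-band:

```python
import numpy as np

def embed_svd(host, mark, alpha=0.05):
    """Additively embed watermark values into the singular values of a
    host block; returns the watermarked block and the original values."""
    U, S, Vt = np.linalg.svd(host, full_matrices=False)
    marked = U @ np.diag(S + alpha * mark) @ Vt
    return marked, S

def extract_svd(marked, S_orig, alpha=0.05):
    """Recover the watermark from the marked block's singular values."""
    S_m = np.linalg.svd(marked, compute_uv=False)
    return (S_m - S_orig) / alpha

# Synthetic host whose singular-value gaps (1.0) exceed the perturbation
# (<= alpha), so the embedding does not reorder the singular values.
host = np.diag(np.linspace(9.0, 2.0, 8))
mark = np.linspace(0.1, 0.9, 8)          # toy grey-scale watermark values
marked, S0 = embed_svd(host, mark)
print(np.allclose(extract_svd(marked, S0), mark))
```

Embedding in singular values rather than raw pixels is what gives such schemes their robustness: small geometric and filtering attacks perturb the singular values far less than they perturb individual pixels.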
    by S.Thanga Revathi, N.Rama Raj 
    Abstract: Nature inspires human beings to a great extent, as Mother Nature has guided us in solving many complex problems around us. Algorithms are developed by analysing the behaviour of nature and the workings of groups of social agents such as ants, bees and insects. Algorithms developed on this basis are called nature-inspired algorithms. These nature-inspired algorithms can be based on swarm intelligence, biological systems, or physical and chemical systems. A few of these algorithms are effective and have proved to be very efficient, and have thus become popular tools for solving real-world problems. Swarm intelligence is one of the most important classes of algorithms, developed from the inspiration of groups of organisms in their habitats. The purpose of this paper is to present a comprehensive list of such collective algorithms that indicates the research scope in this area.
    Keywords: Optimization algorithms; Nature Inspired algorithms; Genetic algorithms.

  • Robust medical image watermarking technique using integer wavelet transform and shearlet transform with BSVD   Order a copy of this article
    by Sivakannan Subramani, G. Thirugnanam, N. Prabakaran, J. Bennilo Fernandes 
    Abstract: There has been increased usage of digital devices in health care services over the last few decades because of the advancements made in the medical field, and the manual diagnosis method has been replaced with e-diagnosis systems. The most appropriate method used for the enhancement of security and authentication of medical data is medical image watermarking, which is crucial for further diagnosis and reference. This paper focuses on medical image watermarking techniques for the protection and authentication of medical data using hybrid transforms. Several developments of the wavelet transform have been proposed in the field of mathematical analysis; one of the recent extensions of the wavelet is the shearlet transform. A hybrid scheme using the Integer Wavelet Transform (IWT) and the Discrete Shearlet Transform (DST) is presented in this paper. Here, the host image and then its low-frequency sub-band are decomposed using IWT and DST, respectively. The selected sub-band from the shearlet transform is processed with the bidiagonal singular value decomposition (BSVD), and the grey-scale watermark image is embedded into its bidiagonal singular values. Images with different textures are examined by this method, and resistance is evaluated against various attacks such as image-processing and geometric attacks. The proposed method produces results with good transparency and high robustness.
    Keywords: Medical image Watermarking; Integer wavelet transform; Discrete shearlet transform; Bidiagonal singular value decomposition.
    DOI: 10.1504/IJAIP.2021.10024078
  • Automatic Detection of Brain Cancer Using Cluster Evaluation and Image Processing Techniques   Order a copy of this article
    by Bobbillapati Suneetha, A. Jhansi Rani 
    Abstract: Brain cancer is detected by radiologists using MRI, which takes considerable time. Most brain tumour detection procedures give complex information about the brain tumour and fall short of giving a definite result on the presence of a tumour. Consequently, a formal consultation with a radiologist is necessary, which becomes a surplus expense in the event of a non-tumour diagnosis. The objective of this work is to develop a supporting system that assists the radiologist in obtaining the aforesaid result, reducing the time taken in brain tumour detection. The proposed procedure includes the following stages. First, the MRI (Magnetic Resonance Imaging) brain image is acquired from a brain MRI image dataset. In the second stage, the acquired MRI image is passed to pre-processing, where the film artefact labels are removed. In the third stage, the high-frequency components are removed from the MRI image using different filtering techniques. Finally, the proposed method investigates the best optimization methodology, Ant Colony Optimization (ACO), which is considered in this work. The proposed techniques reduce the time complexity of brain tumour detection while also improving accuracy. In this work, MRI brain images are taken as input, and end users can examine the MRI report themselves by normal visualization, without consulting a radiologist.
    Keywords: image processing; improvement; median and adaptive filter; automatic detection; filtering; brain cancer; cluster evaluation.
    DOI: 10.1504/IJAIP.2018.10018471
  • Question Answering System for Semantic Web: A Review   Order a copy of this article
    by Irphan Ali, Divakar Yadav, A.K. Sharma 
    Abstract: Querying and searching huge, heterogeneous content on the Web has become an increasingly challenging task with the contemporary growth of the semantic web. In order to make the vision of the semantic web a reality, user-friendly interfaces are needed that can help users query and search this huge and heterogeneous information space. Because of the complexity of natural language, question answering over the semantic web presents many challenges, and thus research opportunities. This work presents a survey on question answering, which has emerged in recent years as an important tool for exploiting the opportunities offered by semantic information on the Web. It provides a comprehensive view by analyzing the history of the question answering research field, from open-domain question answering systems to the latest semantic question answering solutions, before discussing the latest developments in open, user-friendly interfaces for the semantic web. We explore the potential of question answering techniques to go beyond the current state of the art in supporting users in reusing and querying semantic web content. In the initial part of the article, the classification of Question Answering Systems (QAS) into four categories, namely Natural Language Interfaces to Databases (NLIDB), open-domain question answering over text, semantic ontology-based question answering, and QA systems based on types of questions, is discussed in detail. In the later part of the work, the common challenges of question answering over the semantic web are identified, along with solutions and suggested recommendations for future systems. Finally, the review concludes with an outlook on the techniques that need to be pursued to realize the aim of efficient retrieval of answers from the huge, heterogeneous and continuously evolving semantic information on the Web.
    Keywords: Ontology; semantic web (SW); question answering systems (QASs); natural language processing (NLP); named entity recognition (NER).
    DOI: 10.1504/IJAIP.2018.10042066
  • A Frugal and Innovative, Intelligent Messaging Assistant: A Futuristic Approach   Order a copy of this article
    by Shriram Vasudevan, Rahul Ignatius, Himanshu Batra, Aswin Tekur 
    Abstract: For systems receiving large amounts of information and requiring prompt decision making, it is the need of the hour to facilitate quick sifting of the data (regardless of scale) and to follow a specified course of action in the least possible response time, so as to provide real-time decision-making capability. Our system allows users to handle large amounts of streaming data and adopt an appropriate course of action with a trained data-model approach. The system assesses incoming text (or audio, which can be converted to text by a speech-to-text conversion system) and decides the criticality (i.e. urgency) of the message based on pre-marked datasets and certain configurations preset by the user. It evaluates the received request and generates a score based on multiple parameters (such as geography, user history and time). A larger score implies that the message is of high importance, while a low score implies a message is trivial and may be discarded or placed in a spam folder, based on the user's specified preferences. The provision of a real-time decision system will be very useful for use cases where large amounts of information are received and a course of action must be decided instantly. Scalability of conventional systems is an issue, as deploying more personnel to facilitate decision making may not be feasible. We elaborate three use cases for our system: an enterprise scenario, a first-responder scenario and a personal use case (as a personal message/call assistant).
    Keywords: Message filtering; real-time decision system; scalable message ranking system; scalable text classification system.

  • A New Fuzzy and Gaussian Distribution Induced Two Directional Inverse FDA for Feature Extraction and Face Recognition   Order a copy of this article
    by Aniruddha Dey, Shiladitya Chowdhury, Jamuna Kanta Sing 
    Abstract: In face recognition research, the high dimensionality of the data is a crucial problem. Two-dimensional inverse Fisher's discriminant analysis (2DIFDA) is a popular method for binary class assignment. This paper proposes a new fuzzy and Gaussian distribution induced two-directional inverse Fisher's discriminant analysis (FGD-2DIFDA), which computes fuzzy and Gaussian distribution membership values and combines them with the training samples to obtain the class-wise means and the global mean. These fuzzy membership values are taken into account in the inter- and intra-class scatter matrices along the x- and y-axes. Moreover, the intra-class scatter matrices include the Gaussian probabilistic distribution information. Finally, the eigenvalue problems are solved to find the optimal inverse projection vectors. These vectors are used to generate significant discriminant features and to solve the binary classification problem. The FGD-2DIFDA method has been evaluated on the AT&T (formerly ORL), UMIST and FERET face databases using a support vector machine (SVM). Simulation results demonstrate that the proposed FGD-2DIFDA method obtains higher recognition rates than some state-of-the-art face recognition methods.
    Keywords: FGD-2DIFDA; projection vector; FKNN; SVM; Gaussian probability distribution; feature extraction.

    by Ch Mani 
    Abstract: With the rise of the Internet, security has become a critical concern. An intrusion detection system is used to improve the security of networks by evaluating all inbound and outbound network activity and by recognizing suspicious patterns as possible intrusions. For two decades, many researchers have been working on intrusion detection systems. Of late, anomaly detection has gained prominence for its ability to detect novel attacks. Nowadays researchers focus on applying anomaly detection frameworks for intrusion detection in light of their promising results in detecting genuine attacks and in reducing the false alarm rate. In this paper, anomaly detection for network database protection is analyzed and the results are examined. Anomaly detection in network systems is a steadily growing field with compelling applications in areas such as security, detection of network intrusions, detection of malicious network routes, and the human sciences. In this manuscript several important components are considered, such as a system architecture and an assumption-driven anomaly detection technique. We demonstrate their adequacy and results on real datasets with access control from social, spatial and platform-enclosure network domains. Finally, we offer an outline for extending the examined anomaly detection techniques to the dynamic setting of graphs. In this manuscript the MATLAB environment is used for the detection of outliers on network databases, and the performance of the proposed method is also examined.
    Keywords: IDS; network anomaly; security; database protection; outlier detection.

  • Regression Test Case Prioritization Using Genetic Algorithm   Order a copy of this article
    by Anand Kumar Yadav, Anil Kumar Malviya 
    Abstract: On customers' demand, new requirements are implemented in software. The modified software may not work as it did earlier because of the newly added requirements, so it must be tested. RT (Regression Testing) is defined as the retesting of modified software. It is performed using the already developed test suite together with newly developed test cases. Large software systems have large test suites, and for a single requirement change, running all the test cases is not beneficial for the development organization. To make RT more effective, the test suite is prioritized. Here we present a GA (Genetic Algorithm) for TCP (test case prioritization). Different approaches are discussed and implemented using the APFD (Average Percentage of Faults Detected) metric. The discussed approaches are applied to a single problem and the results are shown in tabular form. The APFD metric is applied to all the discussed approaches, and the better one is suggested. This paper uses a GA to arrange the test cases in prioritized order on the basis of the faults detected.
    Keywords: APFD; Genetic algorithm; Regression testing; Test cases prioritization.
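The APFD metric used to compare orderings can be computed directly from the first position at which each fault is exposed: APFD = 1 - (TF1 + ... + TFm)/(n*m) + 1/(2n) for n tests and m faults. A minimal sketch follows (the test/fault data layout is assumed for illustration):

```python
def apfd(order, faults_of):
    """Average Percentage of Faults Detected for a test-case ordering.
    order: test ids in execution order; faults_of: test id -> set of faults."""
    all_faults = set().union(*faults_of.values())
    n, m = len(order), len(all_faults)
    first = {}  # fault -> 1-based position of the first test exposing it
    for pos, t in enumerate(order, start=1):
        for f in faults_of[t] - first.keys():
            first[f] = pos
    return 1 - sum(first.values()) / (n * m) + 1 / (2 * n)

# Three tests, four faults: running t2 first exposes three faults at once.
fm = {"t1": {1}, "t2": {1, 2, 3}, "t3": {4}}
print(round(apfd(["t2", "t3", "t1"], fm), 2))  # prioritized order -> 0.75
print(round(apfd(["t1", "t2", "t3"], fm), 2))  # original order -> 0.5
```

A GA for TCP treats orderings like these as chromosomes and uses APFD (or an estimate of it) as the fitness function to evolve toward orderings that expose faults earlier.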

  • Brain Tumour Segmentation using Weighted K-Means based on Particle Swarm Optimization   Order a copy of this article
    by Naresh Pal 
    Abstract: In medical science, Image Segmentation (IS) is a challenging task; it subdivides an image into mutually exclusive regions. IS is the most fundamental and essential process for the classification, description and visualization of regions of interest in many medical images. In the medical field, diagnosis of the brain and other organs uses Magnetic Resonance Imaging (MRI), a very helpful diagnostic tool, but the traditional technique of MRI Brain Tumour Segmentation (BTS) is an extremely time-consuming task. This paper concentrates on an improved medical IS method based on hybrid clustering: a combination of Weighted K-Means and Fuzzy C-Means (WKFCM), and of K-Means and Particle Swarm Optimization (KPSO). The proposed techniques identify brain tumours accurately with less execution time. Experimental results demonstrate that the proposed hybrid clustering outperforms earlier methods such as FCM, KM, Mean Shift (MS), expectation maximization and PSO on three different benchmark brain databases.
    Keywords: Weighted K-means; Fuzzy C-means; image segmentation.
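Both hybrids build on plain k-means clustering of pixel intensities. A minimal 1-D sketch of that base step (toy intensities assumed; no weighting, fuzzy memberships or PSO initialization):

```python
import random

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Plain 1-D k-means on scalar intensities: assign each value to its
    nearest centre, then recompute each centre as its cluster mean."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Pixel intensities from two tissue classes (toy values).
pixels = [0.1, 0.15, 0.12, 0.8, 0.85, 0.9]
print(kmeans_1d(pixels))
```

The hybrids address the two weak points visible even in this sketch: sensitivity to the initial centres (which PSO mitigates by searching over initializations) and hard all-or-nothing assignments (which fuzzy weighting softens).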

  • Object Recognition based on Topology Preserving Skeleton Features   Order a copy of this article
    by N. Neelima, A. Srikrishna, K. Gangadhara Rao, M. Pompapathi 
    Abstract: Object recognition is a procedure for recognizing a particular object in a digital video or image. Appearance-based or feature-based techniques are used for object recognition. An object's skeleton is a useful cue for object recognition, providing a structural representation that specifies the relationship among object parts; the shape's geometry and topology can be efficiently encoded in it. In this paper, an effective object recognition method is introduced with the help of a multi-kernel SVM, using skeletons derived from the topology-preserving skeleton features method.
    Keywords: skeleton; classification; object recognition; support vector machine; junction points.
    DOI: 10.1504/IJAIP.2018.10026905
  • Non Linear Tensor Diffusion Filter for the denoising of CT/MR images   Order a copy of this article
    by Kumar S.N., A. Lenin Fred, Ajay Kumar H, P.Sebastin Varghese 
    Abstract: Partial differential equation based algorithms play a prominent role in image processing and computer vision applications. The anisotropic diffusion technique has been widely used for image enhancement and denoising, but the Perona-Malik algorithm based on anisotropic diffusion fails to preserve sharp edges and fine details in the denoised image. In this paper, variants of the Perona-Malik (PM) model, the Non-Linear Scalar Diffusion (NLSD) filter and the Non-Linear Tensor Diffusion (NLTD) filter, are analyzed. The algorithms are evaluated on a Shepp-Logan phantom image corrupted with Gaussian and Rician noise, and the results are validated by performance metrics such as PSNR, MAE, EPI and MSSIM. The NLTD filter produces superior results when compared with the NLSD and PM filters. The NLTD filter was also found to yield efficient restoration results for real-time CT/MR images, as validated by an entropy measure.
    Keywords: denoising; Perona-Malik; non-linear scalar diffusion; non-linear tensor diffusion; Rician noise; Gauss gradient operator.
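For reference, the baseline scalar Perona-Malik scheme that the NLSD/NLTD variants improve on can be sketched in a few lines: a four-neighbour explicit update with an exponential edge-stopping function. Parameter values and the periodic (np.roll) boundary handling are illustrative choices, not the paper's:

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, lam=0.2):
    """Classical scalar Perona-Malik diffusion: smooth strongly in flat
    regions, weakly across strong gradients (edges)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite differences toward the four neighbours (periodic borders).
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, 1, axis=1) - u
        dw = np.roll(u, -1, axis=1) - u
        # Edge-stopping conductance g(|grad u|) = exp(-(|grad u|/kappa)^2).
        g = lambda d: np.exp(-(d / kappa) ** 2)
        u += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# Noisy step edge: diffusion should reduce the error on the flat halves
# while the conductance keeps the step itself from being blurred.
rng = np.random.default_rng(1)
img = np.zeros((32, 32)); img[:, 16:] = 1.0
noisy = img + 0.05 * rng.standard_normal(img.shape)
print(np.abs(perona_malik(noisy) - img).mean() < np.abs(noisy - img).mean())
```

The failure mode the abstract points to is visible in the conductance: it depends only on the scalar gradient magnitude, so noise spikes on an edge locally shut diffusion off, which is what the tensor-valued diffusivity of the NLTD variant is designed to fix.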

  • Reliability Evaluation for deployment of Multi Service Multi Functional Service Oriented Computing based on different techniques   Order a copy of this article
    by Abhijit Bora, Subhash Medhi, Tulshi Bezboruah 
    Abstract: Reliability evaluation of service-oriented computing under massive end-user stress places greater emphasis on the adaptability of different hosting techniques. Accordingly, we propose to study the failure and reliability aspects of multi-service web services using the Tomcat and Internet Information Services (IIS) web servers. The service roles, such as broker, parent and service provider, are segregated among different services. We present here in detail the architecture, a reliability evaluation framework, and experimentally and statistically analyzed results to establish the applicability of the proposed work. The overall assessment of such deployments on the respective web servers highlights the dynamic nature of service reliability. The results show that the reliability of Internet Information Services is better than that of the Tomcat web server, but its efficiency in processing massive numbers of requests is comparatively poorer than its counterpart's. The reliability of segregated web service roles is also observed to be better than that of merged roles.
    Keywords: Web Service Communication Foundation; SOAP; Web Service; Load Testing; Reliability; Web Server.
    DOI: 10.1504/IJAIP.2021.10032121
  • An Embedded System and IoT based approach to determine milk quality at milk collection center - pertaining to Indian conditions.   Order a copy of this article
    by Juluru Anudeep, Kowshik G, Shriram KV 
    Abstract: Milk is considered an ideal food because of its abundant nutrients, required by both infants and adults; it is one of the best sources of protein, fat, carbohydrate, vitamins and minerals. Possible reasons behind the adulteration of milk include the demand-supply gap, the perishable nature of milk, the low purchasing capability of customers and the lack of suitable detection tests. India is one of the largest consumers and producers of milk in the world. A recent study by India's food safety regulator (FSSAI) found that 68.4% of milk samples examined did not meet its standards [1]. In 2008, six teenagers died of food poisoning in the eastern state of Jharkhand after drinking sour milk at their boarding school [2]. In spite of various precautionary methods, government and private stakeholders are still struggling to uproot adulteration from the food supply chain. Our innovation is focused on the early detection of milk adulterants and ensures the quality of milk for consumers. We have come up with an IoT solution that consistently tracks sensitive and crucial parameters of milk and uploads them to a website, applying data analytics so that users can keep track of the milk they consume.
    Keywords: IoT; sensors; data analytics; webpage; pH; temperature monitoring; milk colour detection; Lactometer.

  • A method for Solving Cold start problem Using Market Basket Analysis.   Order a copy of this article
    by Nitin Mishra 
    Abstract: Recommendation systems are the foundation of e-commerce businesses across the world. Since the advent of 4G technology in developed and developing countries, people have been using the internet more than ever, and options are available for almost everything online, leaving users confused by the choice. Screens have become smaller while data has grown many times over. It has been observed that users sometimes leave a portal even though the information they need is present, because the application fails to present it to them. A recommendation system eases this by offering users options based on their history in the system, so that choices reflect their likes and dislikes. However, recommendation systems fail when we have no information about the user or item: without user history, standard recommendation algorithms cannot be applied. This is the cold-start problem. In this paper, we propose a market basket analysis (MBA) technique to solve this problem to some level. We develop and test our model on the movie domain using the MovieLens dataset, applying market basket analysis to determine a popularity sequence of the movies. Testing our method on the MovieLens dataset, we found that a consistent performance of between 30 and 60 percent can be obtained.
    Keywords: Recommender systems; Cold-start problem; Market basket analysis; Associative rule mining.
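The popularity-sequence idea in the abstract can be illustrated with a minimal, self-contained market basket analysis. This is an illustrative sketch, not the authors' implementation: the baskets, thresholds and movie titles are invented, and pairwise support/confidence stands in for full associative rule mining.

```python
from itertools import combinations
from collections import Counter

# Toy "baskets": sets of movies rated together by existing users.
baskets = [
    {"Toy Story", "Jumanji", "Heat"},
    {"Toy Story", "Heat"},
    {"Toy Story", "Jumanji"},
    {"Jumanji", "Heat"},
]

def association_rules(baskets, min_support=0.5, min_confidence=0.6):
    """Mine pairwise rules: antecedent -> consequent with support/confidence."""
    n = len(baskets)
    item_count = Counter(i for b in baskets for i in b)
    pair_count = Counter(p for b in baskets for p in combinations(sorted(b), 2))
    rules = []
    for (a, b), c in pair_count.items():
        if c / n < min_support:          # pair must be frequent enough
            continue
        for ante, cons in ((a, b), (b, a)):
            conf = c / item_count[ante]  # P(cons | ante)
            if conf >= min_confidence:
                rules.append((ante, cons, round(conf, 2)))
    return rules

rules = association_rules(baskets)
```

Rules that survive the support and confidence thresholds yield a ranking of items to recommend once a single known item (the antecedent) is observed for a new user, which is how MBA can soften the cold-start problem.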

  • Testing and Evaluation of Crowd Management Strategies at Religious Gatherings in India using Agent Based Modelling and Simulation   Order a copy of this article
    by Abha Trivedi, Mayank Pandey 
    Abstract: Crowd management in religious gatherings is a complex and non-trivial task. Lack of planning and management has resulted in many unfortunate stampede incidents in the past. Though local authorities deploy crowd management personnel to avert and control such crises, unforeseen events still lead to crowd panic resulting in stampedes. These personnel follow pre-defined crowd control strategies and manuals, which are deployed directly in real-time situations without prior testing. In this paper, we propose a simulation framework using agent-based modelling to provide near-real scenarios of crowd gatherings and of crowd interaction and behaviour at religious places. This crowd model is then used to test and evaluate crowd management and control strategies, and steps are suggested to remove the shortcomings of the strategies. We take the Alopi Devi temple of Allahabad, India as a demonstrative case study. The simulation results establish the applicability of our methodology.
    Keywords: Agent Based Modelling; Spread of Rumour; Stampede; Crowd Management; Control Strategy.
    DOI: 10.1504/IJAIP.2021.10025581
  • Audio-Visual Speech Recognition based on Machine Learning approach   Order a copy of this article
    by Saswati Debnath, Pinki Roy 
    Abstract: Audio-visual speech recognition by machine plays an important role now that research in automatic speech recognition has reached its highest performance. Audio alone also gives good performance, but adding visual information potentially yields a more robust recognition system when the audio signal degrades in a noisy environment or varies with the channel. This paper proposes an audio-visual automatic speech recognition (AV-ASR) system based on machine learning approaches. Visual information is captured from the lip contour. Pseudo-Zernike moments (PZMs) and 19th-order Mel-frequency cepstral coefficients (MFCCs) are extracted as the visual and audio features, respectively. The machine learning approaches of artificial neural networks (ANN) and support vector machines (SVM) are used to recognise speech from the audio and visual modalities. After the individual recognition by the two systems, a combined decision is taken. This paper also evaluates the individual performance of both audio and visual speech recognition under each machine learning approach.
    Keywords: Audio-visual speech recognition; lip tracking; Pseudo-Zernike Moment; MFCC; ANN; SVM.
    DOI: 10.1504/IJAIP.2018.10039010
  • On Roman Domination of Circular-arc Graphs   Order a copy of this article
    by Akul Rana, Angshu Kumar Sinha, Anita Pal 
    Abstract: Let G = (V,E) be a graph with vertex set V and edge set E. A Roman dominating function is a mapping f : V → {0, 1, 2} such that every vertex u for which f(u) = 0 is adjacent to at least one vertex v with f(v) = 2. The weight of a Roman dominating function is the value $f(V)=\sum\limits_{v\in V}f(v)$. The minimum weight of a Roman dominating function on a graph G is called the Roman domination number $\gamma_R(G)$. The Roman domination problem on a graph G is to find $\gamma_R(G)$. This problem is NP-complete for general graphs. In this paper, the same problem restricted to the class of circular-arc graphs is considered. In particular, an $O(n^2)$ time algorithm is designed using a dynamic programming approach to compute the Roman domination number of circular-arc graphs, one of the non-tree graph classes. We have also obtained bounds on $\gamma_R(G)$ for circular-arc graphs.
    Keywords: Design of algorithms; Circular-arc graph; Domination; Roman domination.
    DOI: 10.1504/IJAIP.2018.10020075
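The definition above is easy to check mechanically. The sketch below (our illustration, not the paper's $O(n^2)$ dynamic program) verifies a Roman dominating function and computes its weight on the cycle $C_5$, a simple circular-arc graph.

```python
def is_roman_dominating(adj, f):
    """A function f: V -> {0,1,2} is Roman dominating iff every vertex
    with f = 0 has at least one neighbour with f = 2."""
    return all(
        any(f[u] == 2 for u in adj[v])
        for v in adj if f[v] == 0
    )

# C_5: each vertex i is adjacent to i-1 and i+1 (mod 5).
n = 5
adj = {v: [(v - 1) % n, (v + 1) % n] for v in range(n)}

# Place the value 2 on two vertices so every 0-vertex sees a 2.
f = {0: 2, 1: 0, 2: 2, 3: 0, 4: 0}
weight = sum(f.values())   # f(V) = sum of f(v) over all vertices
```

Here the weight is 4, which matches the known value $\gamma_R(C_n)=\lceil 2n/3\rceil$ for cycles, so this f is in fact optimal for $C_5$.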
  • SEA2: Semantic Extractor, Aligner and Annotator - A Framework for Automatic Deep Web Data Extraction, Alignment and Annotation based on Semantics   Order a copy of this article
    by UMAMAGESWARI BASKARAN, Kalpana Ramanujam 
    Abstract: Nowadays, a huge number of web databases are accessible through front-end search query forms. The data records returned are embedded within HTML templates and delivered to the end user as web pages. These web pages are dynamically generated and are not indexed by search engines; they are therefore referred to as deep web pages. They are intended for human understanding, which makes automated processing difficult. To enable machine processing, as needed by many data analytics applications such as business intelligence and product intelligence, the data records embedded in those deep web pages have to be extracted and annotated. This paper proposes an automated solution based on inferred semantic rules for extracting and annotating structured data records from deep web pages. Experimental results show that the use of domain knowledge in the form of inferred semantic rules improves the accuracy of the deep web data extraction process.
    Keywords: deep web; web database; HTML templates; web data extraction; annotation; server-side templates; DOM tree; semantic labeling; hidden web; surface web.

  • An improved algorithm for detecting overlapping communities in social network   Order a copy of this article
    by MEHJABIN KHATOON, W. Aisha Banu 
    Abstract: Social networks, or complex networks, contain hidden communities that tend to have some structure, and discovering the structure of these communities is a significant step in analysing the large-scale structure of complex networks. Many algorithms have been developed for detecting hidden communities inside complex networks. Community detection algorithms result either in partitions of the network, i.e. non-overlapping communities, or in covers of the nodes, i.e. overlapping communities. In this paper, an algorithm for detecting overlapping communities is proposed. The proposed algorithm has been compared with other community detection algorithms on various functional metrics, and the modularity, conductance, assortativity and centrality values of the formed overlapping communities have been compared with those of the whole network. The datasets have been collected from one of the most popular social networks, Facebook, in different areas such as politics, and from the shopping sites Amazon and Flipkart. The proposed algorithm is semi-supervised and can be applied to networks with a large number of nodes, i.e. around 1000 nodes. The proposed approach can detect individuals in the social network who belong to more than one community.
    Keywords: centrality; modularity; overlapping community; social network; community detection.

  • Openflow Groups Based Fast Failover Mechanism for Software Defined Networks (SDN)   Order a copy of this article
    by Harish Sekar, Shriram K Vasudevan 
    Abstract: Software-defined networking (SDN) encompasses several kinds of network technology used to design, build and manage networks. The abstraction of lower-level functionality in SDN enables network administrators to manage network services. We address the link fault tolerance issue using SDN. Link fault tolerance has so far been handled by either a protection or a restoration scheme. We propose a hybrid of both techniques, in which per-link bidirectional forwarding detection sessions are applied to each link and failures are handled accordingly. Our method also ensures that control is not transferred from the switch to the controller unnecessarily. This is done with the aim of reducing the overall response time for link resiliency, so that link failures are handled within the software-defined network (SDN). Our methods have been compared with traditional scheduling and link fast-failover methods, and the results show that they handle scheduling and recovery with a better response time.
    Keywords: OpenFlow; SDN; Managing Networks; Scheduling; Link Failure; Recovery.

  • A Supervised Multinomial Classification Framework for Emotion Recognition in Textual Social Data   Order a copy of this article
    by Abid Hussain Wani, Rana Hashmy 
    Abstract: The task of emotion recognition from text has received much attention since the proliferation of online social networking, which has woven itself into the fabric of people's lives the world over. This study is aimed at extracting lexical and contextual information from text and combining it with semantic information to detect the emotional state of a sentence. We propose a supervised framework for recognising emotions from text. Our framework utilizes word embeddings from Word2Vec to extract the set of words that fall in the semantic proximity of an affect-bearing word, and also takes into account the context in which the words are used. We incorporate new class-specific emoticon features in all our experiments, as emoticons are commonly used on social media platforms. As social media text is generally very informal and irregularly structured, our framework encompasses an appropriate mechanism to handle it. We evaluate our support vector machine-based framework on the Stance Sentiment Emotion Corpus (SSEC) and Aman's dataset. The classification results achieved are better than those of state-of-the-art techniques currently available.
    Keywords: Emotion Detection; Emoticon mapping; Supervised Learning; Social Media Analysis.
    DOI: 10.1504/IJAIP.2018.10027081
  • Human Action Recognition Using Spatio-Temporal Skeletal Data   Order a copy of this article
    by Awadhesh Kumar Srivastava, K.K. Biswas 
    Abstract: Human action recognition from video is an important task with multiple challenges, such as cluttered backgrounds, luminance and occlusions. The availability of depth sensors like the Kinect makes the action recognition task somewhat easier, but brings new challenges in terms of computation cost and noise. We present a novel, computationally economical yet effective method for human activity recognition using skeleton data. We consider the relative changes in body part positions to recognize the activity in a video, and propose the sum of temporal differences of joint-pair distances (STD) as feature descriptors. Further, we show that a random forest classifier with these features can produce better accuracies than various recent state-of-the-art methods. We establish this by experimenting with the publicly available MSR-Action3D and MSR-DailyActivity datasets. The results show that the proposed method achieves accuracies of 93.9% on the former dataset and 87% on the latter.
    Keywords: surveillance; tracking; RGB video; human gesture; activity recognition; depth data; skeleton data; MS-Kinect.
    DOI: 10.1504/IJAIP.2018.10023078
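The STD descriptor described above can be sketched directly from its definition: for each pair of joints, sum the absolute frame-to-frame changes in the pair's Euclidean distance. The joint coordinates below are invented toy data, and the paper's exact normalisation may differ.

```python
from itertools import combinations
import math

def joint_pair_std(frames):
    """frames: list of skeletons, each a list of (x, y, z) joint positions.
    Returns, per joint pair, the summed |delta distance| across
    consecutive frames (the STD feature)."""
    n_joints = len(frames[0])
    feats = {}
    for i, j in combinations(range(n_joints), 2):
        # distance between joints i and j in every frame
        d = [math.dist(f[i], f[j]) for f in frames]
        # sum of absolute temporal differences
        feats[(i, j)] = sum(abs(d[t + 1] - d[t]) for t in range(len(d) - 1))
    return feats

# Two joints over three frames: joint 1 moves away from joint 0.
frames = [
    [(0, 0, 0), (1, 0, 0)],
    [(0, 0, 0), (2, 0, 0)],
    [(0, 0, 0), (4, 0, 0)],
]
feats = joint_pair_std(frames)
```

A pair whose distance stays constant contributes 0, so the feature naturally emphasises the moving body parts that characterise an action.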
  • Efficient Wastewater Discharge Location Speculation System based on Ensemble Classification   Order a copy of this article
    by Brintha Malar C., S. Akilandeswari 
    Abstract: Water pollution is one of the most serious threats to society, as water is the primary need of every organism on earth. It is necessary to control and detect water pollution by assessing water quality. However, the production of wastewater is inevitable, so it is equally important to treat wastewater well enough that the environment is not affected. The pollution control board has formulated standards that provide the range of values for each pollutant and the feasible discharge locations. Taking these standards as the input, this work extracts basic statistical features such as mean, standard deviation, entropy and variance for training the classification system. Ensemble classification is incorporated, comprising the k-nearest neighbour (k-NN), support vector machine (SVM) and extreme learning machine (ELM) classifiers. The performance of the proposed approach is evaluated in terms of accuracy, sensitivity and specificity, and the results are found to be satisfactory.
    Keywords: water pollution; ensemble classification; wastewater discharge.
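The ensemble step can be illustrated with a minimal majority-vote fusion of the three base classifiers' outputs. This is a generic sketch with invented labels and predictions; the paper's actual combination rule may differ.

```python
from collections import Counter

def majority_vote(predictions):
    """predictions: one list of labels per base classifier
    (e.g. k-NN, SVM, ELM). Returns the per-sample majority label."""
    return [
        Counter(votes).most_common(1)[0][0]
        for votes in zip(*predictions)   # group votes sample by sample
    ]

# Hypothetical outputs of the three classifiers on four samples.
knn = ["clean", "polluted", "polluted", "clean"]
svm = ["clean", "polluted", "clean", "clean"]
elm = ["polluted", "polluted", "polluted", "clean"]
fused = majority_vote([knn, svm, elm])
```

With an odd number of base classifiers the vote always resolves, which is one practical reason three-classifier ensembles such as k-NN + SVM + ELM are common.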

  • TODIM based Pythagorean fuzzy multicriteria group decision making through similarity measure   Order a copy of this article
    by Biswajit Sarkar, Animesh Biswas 
    Abstract: This paper presents a TODIM (an acronym in Portuguese for interactive and multicriteria decision making) method for solving multicriteria group decision making problems under possibilistic uncertainty using a Pythagorean fuzzy similarity measure. First, the weighted Hamming distance and Hausdorff distance for two Pythagorean fuzzy sets are defined. A new Pythagorean fuzzy similarity measure based on a combination of the weighted Hamming distance and the Hausdorff metric is proposed to measure the relative degree of dominance between the alternatives in the TODIM context. Then, the overall values of the alternatives corresponding to each decision maker are evaluated using the TODIM technique. Finally, the generalized mean aggregation operator is used to calculate the final decision value for identifying the best alternative. A numerical example is considered and solved to establish the usefulness of the proposed method, and the solutions achieved are compared with existing methods in the Pythagorean fuzzy context.
    Keywords: Multicriteria group decision making; Prospect theory; Pythagorean fuzzy sets; Similarity measures; TODIM.
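For reference, a commonly used form of the weighted Hamming distance between two Pythagorean fuzzy sets A and B over $X=\{x_1,\dots,x_n\}$ is the following; the paper's exact definition and weighting may differ:

```latex
d(A,B) = \frac{1}{2}\sum_{i=1}^{n} w_i \Big(
  \big|\mu_A^2(x_i)-\mu_B^2(x_i)\big|
+ \big|\nu_A^2(x_i)-\nu_B^2(x_i)\big|
+ \big|\pi_A^2(x_i)-\pi_B^2(x_i)\big| \Big),
\qquad \pi^2 = 1-\mu^2-\nu^2,
```

where $\mu$, $\nu$ and $\pi$ are the membership, non-membership and hesitancy degrees, and the $w_i$ are criterion weights summing to one. Squaring the degrees is what distinguishes Pythagorean fuzzy distances from their intuitionistic fuzzy counterparts.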

  • Rice Plant Disease, Crop Stages Endemic And Control Measures - A Survey.   Order a copy of this article
    by T. Gayathri Devi, P. Neelamegam, A. Srinivasan 
    Abstract: Plant diseases are the main cause of economic losses in agriculture. In India, rice is the most significant staple food, but different rice diseases, namely grain discoloration, sheath blight, brown spot and blast, are the main constraints on rice harvesting and production, with blast being the most devastating disease. This survey therefore presents the main research findings on protecting the rice plant against different kinds of diseases, along with their main causes. In addition, the survey presents the infection stages and the control measures that reduce the spreading of diseases.
    Keywords: Oryza sativa; Rice production; rice plant diseases; disease infection; Indian Rice Production.
    DOI: 10.1504/IJAIP.2021.10026406
  • Optimal Allocation of Multiple FACTS Devices Considering Power Generation Pricing for Optimal Reactive Power Dispatch Using Kinetic Gas Molecule Optimization   Order a copy of this article
    by Pradeep Panthagani, R. Srinivasa Rao 
    Abstract: Optimal reactive power dispatch (ORPD) is an important problem in power systems: a complicated non-linear optimization problem with a combination of discrete and continuous control variables. Optimization techniques play a major role in providing effective solutions to such problems. An efficient optimization technique called kinetic gas molecule optimization (KGMO) is proposed in this work for solving the multi-objective ORPD (MORPD) problem. Three major FACTS (flexible AC transmission system) devices, the static VAR compensator (SVC), thyristor-controlled series compensator (TCSC) and unified power flow controller (UPFC), are optimally allocated in the test system. Since the cost of the system increases considerably with these devices, a novel approach that considers power generation pricing in MORPD is adopted. For this purpose, KGMO is applied together with the Pareto optimality (PO) concept, which gives considerably superior results compared with conventional ones. The method is implemented and tested on the IEEE 30-bus system considering multiple objectives.
    Keywords: FACTS (Flexible AC Transmission System); KGMO (Kinetic Gas Molecule Optimization); ORPD (Optimal Reactive Power Dispatch); Multi-objective ORPD (MORPD); Static Var Compensator (SVC); Thyristor controlled Series Compensator (TCSC); UPFC (Unified Power Flow Controller);.
    DOI: 10.1504/IJAIP.2018.10024248
  • Effective Statistical Texture Features for Segmenting Mammogram Images Based on M-ARKFCM with Multi-ROI Segmentation Method   Order a copy of this article
    by Ramayanam Suresh, A. Nagaraja Rao, B. Eswara Reddy 
    Abstract: Mammogram segmentation using multiple regions of interest is one of the most emerging research areas in the field of medical image analysis. The steps involved in this research are of two types: 1) segmentation of mammogram images; 2) extraction of texture features from mammogram images. In recent years, mammogram segmentation systems have become well-developed, but feature extraction (FE) algorithms still face problems such as poor outcomes under severe lighting variations, illuminance, etc. To overcome these difficulties, an effective methodology is proposed in this paper, consisting of three stages. In the first stage, mammogram images from the Mammographic Image Analysis Society (MIAS) dataset are enhanced using Laplacian filtering. The preprocessed mammogram images are then segmented using modified adaptively regularized kernel-based fuzzy c-means (M-ARKFCM). After segmentation, statistical texture FE is applied to distinguish the patterns of cancerous and non-cancerous regions in mammogram images. Finally, the experimental outcome shows that the proposed approach improves segmentation efficiency, in terms of statistical parameters, compared with existing methodologies.
    Keywords: Image segmentation; mammographic image analysis society; modified adaptively regularized kernel-based fuzzy c-means; texture features.
    DOI: 10.1504/IJAIP.2018.10021290
  • A Review of Different Techniques Used for Routing in wireless Sensor Networks   Order a copy of this article
    by Tanaji Dhaigude, Latha Parthiban 
    Abstract: The aim of this paper is to summarise the information related to routing in wireless sensor networks (WSNs). The available routing techniques fall into three categories: hierarchical, flat and location-based routing. WSNs are made of small nodes with wireless communication and computation capabilities. Various researchers are working on techniques that combine effective routing with optimized energy consumption; for the last two decades, improving the power usage of the router used in WSNs has been an open issue. The energy issue can be addressed by using different protocols in the WSN router. The currently available protocols are query-based, multipath, QoS-based, negotiation-based and coherent-based. In this paper, each routing technique is discussed in detail with its advantages and disadvantages, and the authors also highlight future areas of research.
    Keywords: wireless sensor networks; Fault Tolerance; Scalability; Energy consumption.

  • Music mood Taxonomy Generation and classification of Christian Kokborok song: An audio based approach   Order a copy of this article
    by Sanchali Das, M. Prakash, Saroj K. Rajak, Swaapan Debbarma 
    Abstract: Music information retrieval (MIR) is a growing field of research, and music mood classification, which represents the relationship between human emotion and music, is a growing field within it. Much MIR work has been done on western music, and in recent decades Hindi music mood classification has also been addressed, but MIR work remains limited to a few languages. We focus on an under-resourced and less privileged language, Kokborok, the native language of the Twiprasa or Borok people in the north-eastern states of India; it is spoken in Assam, Mizoram and Manipur, and also in other countries such as Bangladesh. No music mood classification research has yet been done for Kokborok. We have therefore developed an audio-based model for classifying the moods of Kokborok songs using prominent features such as rhythm, intensity and timbre, with the J48 decision tree classifier used for classification. Our dataset of 125 songs, with clips of 30 seconds, is used to build the computational model, which consists of four mood clusters, each with three subclasses. We achieve a 54.4% accuracy rate for music mood classification on this data using the Weka machine learning tool.
    Keywords: Kokborok Christian song; Decision Tree J48; Mood taxonomy; Hevner's adjectives.
    DOI: 10.1504/IJAIP.2018.10020901
  • Content Based Fabric Image Retrieval System by Exploiting Dictionary Learning Approach   Order a copy of this article
    by Jasperline Thangaraj, Gnanadurai D 
    Abstract: Owing to the skyrocketing growth of image utilization, it is necessary to organize images by some means. A CBIR system matches a query image against an image database and fetches the images relevant to the query. This makes the image search process easier, and it has found applications in almost all domains. The main issues for a CBIR system are accuracy and time consumption. This work presents a content-based fabric image retrieval system (CBFIR) which relies on the extraction of colour and texture features. The initial clusters are built by the fuzzy c-means (FCM) algorithm and dictionaries are constructed for every cluster; the clusters of each dictionary are updated by the simultaneous orthogonal matching pursuit (SOMP) algorithm. The proposed approach compares the test image with the constructed dictionaries to detect the dictionary with the sparsest representation. The performance of the proposed approach is observed to be satisfactory in terms of accuracy, precision and recall rates.
    Keywords: Fabric image retrieval; image clustering; feature extraction; dictionary learning.

  • An Energy-Efficient Ensemble Clustering Based on Multiple Disjoint and Non Linear Structures for Wireless Sensor Network   Order a copy of this article
    by Sheeja Rani, Siva Sankar 
    Abstract: The evolution of small-scale, cost-effective and intelligent sensors with effective communication capabilities has instigated the emergence of wireless sensor networks (WSNs). As the energy of each sensor node in a WSN is generally restricted, efficient use of energy is the foremost problem to be addressed. Besides, with larger numbers of sensor nodes in a WSN, load balancing is the second issue to be addressed. Different clustering algorithms produce different partitions because they impose different structures on the data; hence, the performance of a single clustering algorithm is not sufficient. In such cases, ensemble clustering through non-linearity becomes an interesting alternative. Here, an ensemble of normalized spectral clustering and separation k-means clustering is employed, in a new framework called normalized spectral cluster and separation k-means (NSC-SKM). NSC-SKM determines the optimal clusters in a network. The framework uses the eigenvector corresponding to the second-smallest eigenvalue of the Laplacian, thereby achieving balance between the sensor nodes in a cluster; as a result, the multiple-disjoint issue is addressed, ensuring load balancing. An enhancement approach is then investigated to minimize energy consumption and improve network lifetime through non-linear structures, achieved by mapping the sensor nodes non-linearly into a higher-dimensional feature space via a separation function. The energy and load performance of the NSC-SKM framework is studied through extensive simulation experiments, which demonstrate the appeal of the proposed framework through significant performance gains compared with baseline solutions.
    Keywords: Wireless Sensor Networks; Load Balancing; Normalized Spectral Cluster; Separation K Means; Graph Theory; Similarity Matrix; Eigen value; Laplacian.
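The spectral half of the abstract rests on a standard idea that can be sketched in a few lines: the eigenvector of the second-smallest eigenvalue of the graph Laplacian (the Fiedler vector) splits a graph into two balanced groups. This is a generic illustration on an invented six-node graph, not the paper's NSC-SKM framework.

```python
import numpy as np

# Adjacency matrix of a graph with two obvious groups {0,1,2} and {3,4,5}
# joined by a single weak link (2-3).
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

L = np.diag(A.sum(axis=1)) - A        # unnormalised graph Laplacian
vals, vecs = np.linalg.eigh(L)        # eigenvalues in ascending order
fiedler = vecs[:, 1]                  # eigenvector of 2nd-smallest eigenvalue
clusters = (fiedler > 0).astype(int)  # sign split -> two balanced clusters
```

The sign pattern of the Fiedler vector separates the two triangles, cutting only the single bridging edge; this balance property is what the abstract exploits for load balancing among sensor nodes.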

  • Asymmetric Enciphering of Images using Affine Transform and Fractional Fourier Transform   Order a copy of this article
    by Savita Anjana, Indu Saini, Phool Singh, Anil Kumar Yadav 
    Abstract: This paper presents an asymmetric enciphering technique for binary and grayscale images that uses the affine transform with amplitude and phase truncation operations in the fractional Fourier domain. The affine transform is used to introduce randomness for additional security, and to resist a specific attack recently mounted on asymmetric schemes. The scheme is validated for binary and grayscale images in MATLAB. The affine transform parameters and the orders of the fractional Fourier transform serve as encryption keys in addition to the two private keys of the asymmetric cryptosystem. The scheme has been tested for its sensitivity to these parameters. The scheme is also evaluated for its robustness against occlusion and noise attacks. Other usual attacks and the specific attack on the scheme are also discussed.
    Keywords: Asymmetric cryptosystem; affine transform; fractional Fourier transform; binary and grayscale images.
    DOI: 10.1504/IJAIP.2021.10026407
  • Fuzzy based Efficient Task Scheduling Scheme (ETSS) on Heterogeneous Multicore Processor   Order a copy of this article
    by K. Indragandhi, P.K. Jawahar 
    Abstract: Scheduling tasks according to real-time requirements is a major concern in heterogeneous multicore processors. A heterogeneous multicore processor contains cores with different capabilities, and the scheduler plays a major role in distributing tasks among the cores so as to improve performance by efficiently utilizing the resources of each core. The main proposal of this work is a fuzzy logic-based efficient task scheduling scheme for non-periodic tasks on a soft real-time heterogeneous multicore processor. The proposed work describes two fuzzy logic-based scheduling schemes, namely the dynamic priority generator (DPG) and the efficient task scheduling scheme (ETSS). Tasks with different execution times and different deadlines are assumed. The DPG algorithm assigns priority based on the execution time and deadline of each task. In ETSS, the task assigned the highest priority by the DPG is scheduled for execution on a high-performance core, while the remaining low-priority tasks are scheduled on low-performance cores. The objective of these algorithms is to increase CPU core utilization, performance and throughput. A Simulink model was designed to implement the two fuzzy-based algorithms, and test cases were generated for different task execution times and deadlines to evaluate the proposed system. Results show that, among the heterogeneous quad cores, the CPU utilization of core 2 is 56.7% for the first set of membership functions and that of core 3 is 49.4% for the second set.
    Keywords: Heterogeneous; Multicore Processor; Fuzzy Inference System; Task scheduling; Membership function.
    DOI: 10.1504/IJAIP.2018.10025677
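The DPG/ETSS pipeline above can be sketched with a crisp stand-in for the fuzzy stage: the fuzzy membership functions are replaced here by invented fixed weights, so this illustrates only the shape of the scheme (priority from execution time and deadline, highest priority to the fast core), not the paper's actual inference system.

```python
def dpg_priority(tasks):
    """tasks: {name: (exec_time, deadline)}. Crisp stand-in for the fuzzy
    DPG: tighter deadlines and shorter execution times get higher priority.
    The 0.6/0.4 weights are invented for illustration."""
    max_e = max(e for e, d in tasks.values())
    max_d = max(d for e, d in tasks.values())
    return {
        name: round(0.6 * (1 - d / max_d) + 0.4 * (1 - e / max_e), 3)
        for name, (e, d) in tasks.items()
    }

# Three hypothetical non-periodic tasks: (execution time, deadline).
tasks = {"t1": (4, 20), "t2": (2, 10), "t3": (8, 40)}
prio = dpg_priority(tasks)

# ETSS step: the top-priority task goes to the high-performance core.
high_perf_core_task = max(prio, key=prio.get)
```

A real fuzzy inference system would replace the weighted sum with membership functions and rule evaluation, but the routing decision that follows it is the same.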
  • Certain properties of interval-valued intuitionistic fuzzy graph   Order a copy of this article
    by Hossein Rashmanlou, Ali Asghar Talebi, Seyed Hossein Sadati 
    Abstract: The concept of interval-valued intuitionistic fuzzy sets was introduced by K. Atanassov. Interval-valued intuitionistic models provide more precision, flexibility and compatibility to a system than classic fuzzy models do. In this paper, we introduce certain concepts of covering, matching and paired domination using strong arcs (effective edges) in interval-valued intuitionistic fuzzy graphs (IVIFGs), with suitable examples, and investigate some of their properties. We also calculate the strong node covering number, the strong independence number and other parameters of complete and complete bipartite IVIFGs.
    Keywords: Interval-valued intuitionistic fuzzy graph; strong arcs (effective edges); covering; matching; paired domination.

  • Estimating the Perspicacious features of ECG Recording Based on Template Classification for detecting Atrial Fibrillation   Order a copy of this article
    by V.R. Vimal, P. Anandan, V. Induja 
    Abstract: Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia, occurring in 1-2% of the general population, and is associated with substantial mortality; it is also responsible for 15% to 20% of all strokes. The available AF detection techniques are sometimes unable to discriminate AF from certain other arrhythmias, and may misclassify other irregular rhythms or noisy ECGs as AF, resulting in false alarms. The focus of our research is to develop an algorithm to detect AF with high accuracy, robustness to noise and a low false alarm rate. Since AF affects both heart rate variability and ECG morphology, the proposed method combines classification based on heart rate variability features with templates of the ECG waveforms. Compressive sensing (CS) has recently been applied as a low-complexity compression framework for long-term monitoring of electrocardiogram signals using wireless body sensor networks. The proposed method addresses the problem of estimating the heartbeat rate in compressively sensed electrocardiogram (ECG) recordings while avoiding reconstruction of the whole signal. We consider a framework in which ECG signals are represented in the form of CS linear measurements; QRS locations are then estimated in the compressed domain by computing the correlation with the compressed ECG, based on the QRS pattern.
    Keywords: Wireless body sensor networks; Atrial fibrillation; Compressive Sensing.
    DOI: 10.1504/IJAIP.2018.10025790
  • Hybrid privacy preservation model for big data publishing on cloud   Order a copy of this article
    by Suman Madan, Puneet Goswami 
    Abstract: Cloud computing has gained popularity since users can share a large amount of data through cloud servers, which provide services to a large number of users simultaneously. Third-party users access the information provided by the various users of the cloud network. While sharing the information of a licensed user with a third party, user privacy and the utility of the information need to be maintained. Various techniques have been introduced to increase the privacy of the information. This work introduces an anonymization-based technique for preserving the privacy of data before publishing it in the cloud environment. This paper uses the k-anonymization criterion of duplicating k records in the information database. The proposed preservation model uses the dragonfly (DF) algorithm to achieve the k-anonymized database, and defines a fitness function for achieving maximum privacy and utility during the process. The performance of the proposed work is analysed with metrics such as classification accuracy and information loss, against various comparative methods. The simulation results show that the proposed privacy preservation model with k-anonymization and the DF algorithm achieves better values for classification accuracy and information loss.
    Keywords: Big data; cloud computing; privacy preservation; k-anonymization; dragonfly algorithm.
    DOI: 10.1504/IJAIP.2018.10025582
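The model above anonymises records so that each quasi-identifier combination is shared by at least k records. A minimal sketch of checking the k-anonymity criterion (field names are hypothetical; the paper's dragonfly optimisation of the anonymisation itself is not reproduced):

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every quasi-identifier combination appears in at least k records."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())
```

A fitness function like the paper's would balance such a privacy check against an information-loss measure.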
  • Classification and Comparison of IP Traceback Techniques for DoS/DDoS/DRDoS Defense   Order a copy of this article
    by Marjan Kuchaki Rafsanjani, Hashem Bagheri Nezhad 
    Abstract: The Internet has increased the speed of data transmission; however, attacks in this environment are growing exponentially. Furthermore, identifying the source of attacks is very difficult due to the dynamic and anonymous nature of the Internet. Denial-of-service (DoS) attacks are one type of attack in this environment that can be carried out in many forms. Denial-of-service (DoS), distributed denial-of-service (DDoS) and distributed reflector denial-of-service (DRDoS) attacks try to saturate the victim's network servers with external requests and to make their resources unavailable to lawful users. IP traceback is the ability to identify the source of this type of attack, and is therefore an important step in defending against these attacks. Many IP traceback schemes have been presented to date. In this article, we review several schemes presented in the last decade and compare them against predefined metrics, helping researchers to discover the gaps for further research in this area.
    Keywords: IP traceback; Traceback schemes; Packet marking; Packet logging; DoS/DDoS/DRDoS defense.
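One family of traceback schemes such surveys cover is probabilistic packet marking, where each router stamps a forwarded packet with some probability so the victim can reconstruct the attack path from many packets. A simplified node-sampling sketch (real edge-sampling schemes encode edges and distances in the IP header, which is not modelled here):

```python
import random

def forward(packet, route, p=0.5, rng=random):
    """Each router on the route overwrites the packet's mark with
    probability p; the victim infers the path from many such samples."""
    for router in route:
        if rng.random() < p:
            packet["mark"] = router
    return packet
```

With p < 1, marks from routers farther along the path survive more often, which is what lets the victim order the routers statistically.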

  • Modeling and analysis of Space Vector Pulse Width Modulated Inverter Drives System Using Matlab Simulink   Order a copy of this article
    by Shanmugasundaram Nithaiyan, Pradeep Kumar Sivakumar, Ganesh E.N 
    Abstract: This paper presents high-frequency cable modelling and analysis. The advantages of dodecagonal space vector switching and two-level inverters are achieved with a constant DC input supply. An improved high-frequency power cable model is connected between the drive and the induction motor to measure the harmonic distortion and the reflected overshoot voltage at the motor terminal. The results confirm that the present method is valid in both the high-frequency and time domains. This model is used to predict the effect of long cables for any space vector pulse width modulation (SVPWM) technique and for six-step two-level voltage inverter fed induction machines. By comparison, the harmonic distortion of the SVPWM can be estimated from low-frequency to high-frequency switching. Cable equivalent circuits are developed for different frequency ranges and implemented, and the results are shown in Matlab/Simulink.
    Keywords: cable modeling; frequency Drive; Cable parameters; space vector modulation; Induction motor drives.

  • A Comparative Study and Implementation of Neuro-Fuzzy and Decision Tree for Malignant Tumor Detection System   Order a copy of this article
    by Sanjeev Kumar, Rajesh Kumar Maurya, Sanjay Kumar Yadav, Baij Nath Kaushik 
    Abstract: Breast cancer is one of the most common chronic diseases found in women. There are two types of breast tumour, malignant and benign; a patient with a higher percentage of malignant tumour is suffering from breast cancer. A model based on neuro-fuzzy techniques is proposed to classify a tumour as malignant or benign. The designed system works on various attributes of the tumour such as thickness, shape and size. The classification process completes in three phases: phase 1 classifies the attributes as cat1 or cat2 on the basis of information gain. In phase 2, the cat1 attributes are used to select the class of tumour using a radial basis function (RBF) neural network, while the cat2 attributes use fuzzy logic to select the class of tumour. The results of both techniques are combined using a fuzzy inference system in phase 3. The effectiveness of the technique is easily identified from the results, which compare the cancer detection accuracy of cat1 and cat2 under the neuro-fuzzy system and a decision tree.
    Keywords: Breast cancer; malignant; benign; tumor; RBF; fuzzy; Decision tree.
    DOI: 10.1504/IJAIP.2019.10026262
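Phase 1 above splits attributes by information gain. A minimal sketch of the entropy-based gain computation usually used for such splits (not the authors' code; attribute and label values are hypothetical):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def information_gain(values, labels):
    """Reduction in label entropy after splitting on an attribute's values."""
    total = len(labels)
    split = {}
    for v, y in zip(values, labels):
        split.setdefault(v, []).append(y)
    remainder = sum(len(ys) / total * entropy(ys) for ys in split.values())
    return entropy(labels) - remainder
```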
  • A novel approach to select cluster head and optimising the data transmission in VANET   Order a copy of this article
    by V. Gokula Krishnan, N. Sankar Ram 
    Abstract: Communication between vehicles has a major role in intelligent transport systems (ITS). Vehicular ad-hoc networks (VANETs) were developed to improve road safety. Data transmission between vehicles is a major challenge in VANETs, because VANETs have high mobility and dynamic topology. In this paper, we propose a cluster-based approach to improve data transmission, including a novel approach for choosing the cluster head (CH). The CH selection algorithm works at three levels: first, normal CH selection; second, cluster leaving; and finally, cluster merging. After choosing the CH, the routing path is established based on link quality, which is measured using link lifetime. The proposed system is implemented using the SUMO and NS3 tools.
    Keywords: Vehicle Ad-hoc Networks; Intelligent Transport System; Cluster Head Selection.
    DOI: 10.1504/IJAIP.2018.10020599
  • Statistical Analysis of EMG & GSR Biofeedback Efficacy On Different Modes for Chronic TTH on Various Indicators   Order a copy of this article
    by Rohit Rastogi, D.K. Chaturvedi, Santosh Satya, Navneet Arora, Vikash Yadav, Vishwas Yadav, Pallavi Sharma, Sumit Chauhan 
    Abstract: Biofeedback therapy has long been popular for treating stress-related problems and tension-type headache (TTH). TTH is very common; when it appears frequently and with increasing frequency it becomes distressing. The physiological responses to TTH-based stress may differ between acute and chronic stress. Stress-related disorders are often termed psychosomatic disorders, involving both mind and body: disorders in which the mind makes the body vulnerable to illness. TTH is a prevalent and important condition in psychosomatic medicine because of its correlation with psychosocial factors. Major negative life events, such as the death of a close family member, surgery or divorce, have in past years been plausibly related to attributes of headache. Biofeedback therapies are non-pharmacological treatments that use scientific instruments to evaluate, amplify and feed back physiological information to the patients being monitored, thereby promoting control and manipulation of physiological parameters. Researchers play a key role in the industrialisation and commercialisation of academic work, and the commercialisation of academic processes and their related organisations has greatly increased research activity in enterprises and industries. With the Internet now available to everybody at affordable rates, knowledge on many topics is available to individuals rather than only to public bodies as before. Many research tools are being created, cross-checked by validation techniques, and patented by academia and industry, with many developers working to build tools to company specifications.
The SF-36 score and its associated questionnaire are widely used, and the organisations behind it (the Medical Outcomes Trust, QualityMetric and Optum, with their work on quality-metric design and health-standard assessment) have prepared policies for the global use of SF-36 data. These organisations provide licence agreements to individual scholars for commercial and research purposes, internally evaluate data and response consistency, and ease the scoring and interpretation of data for general users and researchers. SF-36 scores can also be published on the internet to enlarge their reach, and the collected data can be analysed with big data techniques by any organisation. Multimedia technology has shifted the way we look at computers. Early computers were single-purpose machines that could only solve complex mathematical problems; by the 1960s, mainframe computers managed huge corporate databases. The use of multimedia devices has since increased at an exponential rate in all formats (audio, visual and audio-visual), building a billion-dollar market in electronic gadgets. For technologists, the new challenge is handling the huge volume of data generated every millisecond. For health-related applications, multimedia big data (MMBD) goes beyond the scalar sensor data on which IoT development has focused, and its complexity and unique nature extend IoT applications through biofeedback sensors. This paper deals with MMBD concerns such as reliability, scalability, accessibility, heterogeneity and quality of service for mental health.
The present paper reports the comparison of two therapies, EMG and GSR biofeedback, in audio, visual and audio-visual modes for TTH. The study was conducted over 12 months, and physical, mental and total scores were calculated. The authors have tried to verify which therapy is more efficient across the audio-visual modes of EMG and GSR by comparing the different scores. Background and purpose: nearly 90% of all headaches are tension-type headaches (TTH). The efficacy of electromyography (EMG) biofeedback (BF) for tension-type headache has been extensively studied and demonstrated with the help of big data. However, few studies address the efficacy of galvanic skin resistance (GSR) biofeedback, and so far there are no studies on the efficacy of scheduled visual, audio and combined EMG- or GSR-biofeedback for TTH.
    Keywords: Stress; EMG; GSR; Spirituality; Mental Health; Meditation; TTH; EMG and GSR Biofeedback; Audio; Visual; SF36; Mental and physical Scores.
    DOI: 10.1504/IJAIP.2019.10021825
  • New Concepts in Intuitionistic Fuzzy Labeling Graphs   Order a copy of this article
    by Hossein Rashmanlou, M. Devi 
    Abstract: In this paper, the concept of intuitionistic fuzzy labelling (IFL) is introduced. Intuitionistic fuzzy bi-magic labelling (IFBL) and intuitionistic fuzzy anti-magic labelling (IFAL) are discussed for cycle, star and path graphs. Intuitionistic fuzzy sets assign both a degree of membership and a degree of non-membership, which are independent of each other to the extent that their sum is not greater than one. Various types of arcs are analysed, and the blank arc is introduced and discussed for intuitionistic fuzzy labelling. We also discuss intuitionistic fuzzy bridges and intuitionistic fuzzy cut-vertices with examples, and study their properties.
    Keywords: Intuitionistic fuzzy graph (IFG); Intuitionistic fuzzy Bi-magic labeling; Intuitionistic fuzzy Anti-magic labeling; Intuitionistic fuzzy bridge; Intuitionistic fuzzy cut-vertices.
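An intuitionistic fuzzy element carries a membership degree μ and a non-membership degree ν constrained by μ + ν ≤ 1, with the slack interpreted as hesitation. A minimal sketch of these constraints (illustrative only; the labelling constructions of the paper are not reproduced):

```python
def is_valid_ifs_pair(mu, nu, eps=1e-9):
    """An intuitionistic fuzzy element needs mu, nu in [0, 1] and mu + nu <= 1."""
    return 0 <= mu <= 1 and 0 <= nu <= 1 and mu + nu <= 1 + eps

def hesitation(mu, nu):
    """Hesitation (indeterminacy) degree: pi = 1 - mu - nu."""
    return 1 - mu - nu
```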

  • Abnormality Identification of Breast Mammogram Image Segmentation with Iterative Restricted Mode Algorithm   Order a copy of this article
    by Nagiii Reddyyy 
    Abstract: Breast image segmentation is a common problem in medical image processing, as researchers need to extract information at high resolution without loss of detail. In this paper, we propose a segmentation method using the iterative restricted mode (IRM) algorithm and a Markov random field (MRF) model to detect abnormalities in mammogram images. IRM permits the lowest-energy labelling over all iterations, and the method exploits the highly compact connection between boundary-label MRFs. The model is tested on five images and evaluated using objective criteria, namely the Jaccard coefficient (JC), volumetric similarity (VS), variation of information (VOI), global consistency error (GCE) and probabilistic Rand index (PRI). The performance of the segmented images is also evaluated using image quality metrics. The simulated results obtained using T1-weighted images are compared with the existing model.
    Keywords: MRI method; repetitive mode; Markov Random field; segmentation of Images; Kernel; Quality metrics.

  • LSTM based Statistical framework for human activity recognition using mobile sensor data   Order a copy of this article
    by Krishna Kishore 
    Abstract: Fall detection plays an essential part in healthcare monitoring systems and helps elderly and disabled people. Advances in supporting technologies have pushed researchers to focus on activity recognition to improve the quality of life of needy people in emergencies. In this paper, the authors propose a methodology for efficient automatic fall detection, which can perceive every possible fall event using human activity recognition (HAR). Eigen features and other time-related features are computed from data collected from the sensors of Android-based mobile devices. These features are analysed to classify physical activities such as falling, walking, sitting, going upstairs, going downstairs and jogging. In this work, a long short-term memory (LSTM) neural network is used to classify human activities. Based on this, alerts are generated in case of fall detection; otherwise data are archived for future reference. The performance of the proposed framework is evaluated on two benchmark datasets (WISDM and UCI) and one real-time tracked dataset. The accuracy of the proposed framework on the tracked dataset is 91.48%, outperforming other classifiers.
    Keywords: Fall detection; Smartphone Sensors; Activities of Daily Living (ADL); Human Activity Recognition; Long Short-Term Memory Classification.
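Before classification, sensor streams such as those above are typically segmented into windows and summarised by time-domain features. An illustrative sliding-window sketch (window length and feature choice are hypothetical, not taken from the paper):

```python
from math import sqrt

def window_features(samples, window=5):
    """Mean and standard deviation per non-overlapping window of sensor values."""
    feats = []
    for i in range(0, len(samples) - window + 1, window):
        w = samples[i:i + window]
        mean = sum(w) / window
        var = sum((x - mean) ** 2 for x in w) / window
        feats.append((mean, sqrt(var)))
    return feats
```

Feature tuples like these would form the per-timestep inputs to an LSTM classifier.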

  • A Survey on Wireless Networks to Balancing the Load in Wireless Mesh   Order a copy of this article
    by Subba Rao 
    Abstract: Nowadays wireless technology occupies a very prominent role in all sectors, and WMNs will play a prominent role in coming generations because they have many benefits over other wireless networks. However, there are still many technical issues in WMNs to be discussed. A major problem in WMNs is sustaining load balance: when the incoming data traffic to a node is greater than the outgoing data traffic, congestion in the network is high. Various authors have proposed techniques to reduce congestion and improve network throughput. This paper analyses various load balancing techniques to help researchers as well as practitioners in choosing a proper load balancing technique for improving network performance.
    Keywords: Wireless mesh network; Gateway; Load balance; Path; Router; NS2.

  • Prediction and Detection of Kidney Diseases using Ensemble Classification   Order a copy of this article
    by Suresh Babu 
    Abstract: Chronic kidney disease (CKD) is one of the most dangerous diseases, occurring in many people nowadays. Research is ongoing into the exact causes of kidney damage; some research suggests it may be due to lack of drinking water, high blood pressure, diabetes, and conditions such as severe dehydration and kidney trauma. Building on this research, this paper describes a new approach to the prediction and detection of chronic kidney disease using ensemble data mining classification for better results.
    Keywords: CKD; Diabetes.
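Ensemble classification as described above typically combines several base classifiers by voting. A minimal majority-vote sketch (the base classifiers here are hypothetical callables, not the paper's models):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine base-classifier predictions for one sample by majority vote."""
    return Counter(predictions).most_common(1)[0][0]

def ensemble_predict(classifiers, sample):
    """Each classifier is a callable returning a class label for the sample."""
    return majority_vote([clf(sample) for clf in classifiers])
```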

  • Towards Smart Healthcare System in Airlines with IoT and Cloud Computing   Order a copy of this article
    by Veeraa Anjaneyuluu 
    Abstract: The world is becoming smart with the advances brought by new technologies. In the recent past, the Internet of Things (IoT) has been playing a major role in all fields. Smart healthcare applications are developing in many aspects, and cloud computing is also an effective part of data communication around the world. Detecting and controlling contagious diseases is a major issue when people travel all over the world by air. In this paper we propose an architecture for smart identification of persons with diseases while travelling by airline. Adoption of this architecture can help control and stop the spread of contagious diseases. Effective approaches are discussed to determine the probability of infection with a particular disease using probability tables. Based on our concept and results we also give directions for the development of tools and applications.
    Keywords: Internet of Things; Cloud Computing; Contagious Diseases; Airlines System.

    by Jyothsna Devi 
    Abstract: A tumour is a mass of abnormal cells, and a tumour that grows inside the human skull is termed a brain tumour. The human brain is enclosed by the skull, and a tumour growing in such a restricted space can cause problems. Tumours that grow in the brain are categorised as cancerous or non-cancerous; if they are not detected at an early stage they may lead to brain damage and can be life-threatening. Recent literature shows that the support vector machine (SVM) is a supervised classification technique that has gained popularity because it exhibits high generalisation ability even when trained with small training sets, and it generalises well to many real-time problems. In this work we use an SVM-based classifier to identify whether a given MRI image of a brain tumour shows a malignant or benign tumour. Initially, hand-crafted features such as discrete wavelet transform (DWT) or GIST features are extracted from the given MR images, followed by preprocessing, segmentation and SVM-based classification. Each of these representations has its own advantages for representing images; considering any single representation ignores the advantages of the others. Our proposed method tries to exploit the benefits of different representations of the images. Motivated by fusion-based classification models, we extract different representations from the given MR images and fuse them to represent each image as a single feature vector, applying different fusion techniques to improve the performance of SVM-based tumour classification. Our experimental studies on benchmark datasets show that fusion techniques can enhance the accuracy of SVM classification for brain tumours. Along with fusion, we also examine the effect of various kernels on the classifier's performance.
    Keywords: Support Vector Machine; Otsu Segmentation; Discrete Wavelet Transform; Non local mean filter; fusion.
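The fusion step described above combines several feature representations into one vector before SVM classification. A minimal sketch of two simple feature-level fusion strategies, plain concatenation and max-normalised concatenation (illustrative; not necessarily the authors' fusion rules):

```python
def concat_fusion(*feature_vectors):
    """Early fusion: concatenate per-representation features into one vector."""
    fused = []
    for v in feature_vectors:
        fused.extend(v)
    return fused

def normalized_fusion(*feature_vectors):
    """Scale each representation to unit max before concatenating, so no
    single representation dominates the SVM kernel."""
    fused = []
    for v in feature_vectors:
        m = max(abs(x) for x in v) or 1.0  # guard against all-zero vectors
        fused.extend(x / m for x in v)
    return fused
```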

    by K. Priya, K. Dinakaran 
    Abstract: Nowadays online shopping by customers is increasing day by day. Customers are aware of buying products based on product features, which may be model, colour, size, durability or price. Customer reviews and feedback based on the price of products are collected from three different shopping websites, consolidated, and ranked per website. The customer can then buy the product from the online shopping website holding the lowest price. Through this work customers can avoid confusion while shopping for products. Generic wrapper and VIPS techniques are used. These details can also be posted or shared on social network pages for customers' convenience. Customers can thus maintain a budget and avoid taking wrong decisions during online shopping, and the time needed to search product information on each website is reduced.
    Keywords: Generic wrapper; Wrapper Generation; product review; social network; Review Summarization; E-Commerce.
    DOI: 10.1504/IJAIP.2021.10030025
  • Combination of Machine Learning methods to Solve Cold start problem in Recommender system   Order a copy of this article
    by Nitin Mishra, Vimal Mishra 
    Abstract: Recommender systems are a special type of intelligent system that exploits the history of user ratings on items to recommend items to those users. They are used in a wide range of applications such as online shopping, e-commerce services and social networking applications, as well as in banks and other services; they can also be used for fault finding in critical systems. In this paper we discuss the cold start problem, where a new user has no rating history. We use a clustering approach to cluster users and then use these cluster labels for supervised machine learning to solve the cold start problem for new users. We validate our solution on the MovieLens dataset and find that it solves the cold start problem effectively, so we claim our approach as a novel approach to the cold start problem using a combination of several methods, some from the collaborative filtering domain and others from the content-based domain. We have checked the solution exhaustively for faults. As our method predicts specific items, the results are accurate for our domain and may vary slightly in similar domains; theoretically, however, our method can be used to solve the cold start problem in any domain. We also show that our method's performance improves with increasing N in top-N recommendation.
    Keywords: Cold start problem; Recommender systems; Machine learning; classification; clustering; k-modes clustering.
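The clustering-then-classification idea above can be illustrated with a k-modes-style assignment: a cold-start user is matched to the cluster whose mode (most frequent categorical profile) is most similar, and that cluster's popular items can seed the recommendations. A minimal sketch with hypothetical demographic attributes:

```python
def hamming(a, b):
    """Dissimilarity between two categorical attribute tuples (k-modes style)."""
    return sum(x != y for x, y in zip(a, b))

def assign_cluster(new_user, modes):
    """Place a cold-start user in the cluster whose mode is most similar."""
    return min(modes, key=lambda label: hamming(new_user, modes[label]))
```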

  • Smart mirror with biometric authentication   Order a copy of this article
    by T. Vishal Patro, M. Gayathri, C. Malathy 
    Abstract: A smart mirror, also called a magic mirror, is a wall-mounted or bathroom mirror that displays data along with providing the functionality of a normal mirror. It displays plain text, images, date, time and weather as well as recent news headlines, and can also display mails, shopping lists and important reminders from someone's mail account. Biometric authentication is used in this work to control the smart mirror's display functionality. The statement "everyone is unique" is the basic premise of biometric authentication. The mirror supports facial recognition technology for authentication, so only an authorised person is able to use the display facility on the mirror, for as long as he or she is in front of it. These smart mirrors can be used in the bathroom or bedroom. In this model, the user is required to authenticate himself by the face detection biometric technique. This saves a lot of time: instead of viewing personal mail feeds or social app notifications on a phone or laptop, the user can watch the smart mirror while doing daily morning tasks such as brushing their teeth or washing their face.
    Keywords: Smart mirror; Authentication; Face recognition; Bio authentication; Magic mirror; Security; Raspberry pi.
    DOI: 10.1504/IJAIP.2021.10034759
  • Energy saving management system in educational buildings   Order a copy of this article
    by G. Sasikala, B. Roopesh, S.V. Charan Sai, Y. Sathish Kumar 
    Abstract: In the present scenario, unnecessary consumption of electricity in educational institutions is observed at high rates, and this increased consumption is reflected in an increased cost of electric power. So an efficient system is proposed that activates power consumption only when there is a need: the electrical appliances are switched on as the first person enters and switched off as the last person exits. This automation can later be implemented using the Internet of Things (IoT). The proposed concept is used to save energy in educational buildings; in future the system can be developed for many universities and software sectors to save and control energy.
    Keywords: Home Appliances; Relays; Node MCU; ESP8266.
    DOI: 10.1504/IJAIP.2019.10024828
    by Ram Chandran.M, Vishnu Priya.R. 
    Abstract: Worldwide, agriculture is considered the backbone of a country's economy. The contribution of agriculture to India's economy is steadily declining with India's economic growth; still, agriculture plays a significant role as the broadest economic sector. The major objective of the present work is to repel the pests that affect agricultural fields, especially paddy fields. The pests that mainly affect paddy fields are small insects, grasshoppers and moths. Initially, we capture images of the field and process them using MATLAB software, where we developed code that determines whether the number of small insects is high or low. If it is high, an appropriate pesticide is sprinkled over the field, controlled by a PIC16F877A microcontroller. Further, we collected the frequency sensitivities of grasshoppers and moths in order to repel them. The period of the day at which grasshoppers and moths are at a maximum was obtained from a previously published paper. Using this information we generate these frequencies at the appropriate times. As a result, the paddy field is kept free from grasshoppers and moths.
    Keywords: Frequency bandwidth; grasshoppers; image processing; moths; pest control.

  • Machine Learning Approach to Predict Purchase Decision of Bank Products and Services   Order a copy of this article
    by Saumya Chaturvedi, Vimal Mishra 
    Abstract: We propose a machine learning approach to predict purchase decisions for bank products and services. The data were collected from May 2008 to May 2014 at a Portuguese bank. This investigation will help to predict the bank's business, financial inflation and recent trends in bank products and services. The investigation focuses on the classification and prediction of bank telemarketing calls for a term deposit product. We analysed a large dataset of 41,188 observations related to bank clients, products, services and socioeconomic attributes. Initially the dataset had 150 features, and we selected the 21 most relevant features using standard adaptive forward selection and intelligence quotient. We also compared four machine learning approaches: conditional inference trees (Ctree), recursive partitioning (Rpart), support vector machines (SVM) and random forest. The paper contains an impact analysis of changing the training dataset and the training time of a model. The observatory study integrates both parameters, accuracy and model learning time, to form a generalised and optimised solution for predicting bank business.
    Keywords: Machine Learning; Business Intelligence; Data Mining; Decision support systems.
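The abstract mentions standard forward selection to reduce 150 features to 21. A generic greedy forward-selection sketch, where `score` is any caller-supplied evaluation of a feature subset (the paper's actual selection criterion is not specified here):

```python
def forward_select(features, score, max_features):
    """Greedy forward selection: repeatedly add the feature that most
    improves score(subset), stopping when nothing improves it."""
    selected, remaining = [], list(features)
    while remaining and len(selected) < max_features:
        best = max(remaining, key=lambda f: score(selected + [f]))
        if score(selected + [best]) <= score(selected):
            break  # no remaining feature improves the score
        selected.append(best)
        remaining.remove(best)
    return selected
```

In practice `score` would be, for example, cross-validated accuracy of a classifier trained on the candidate subset.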

  • Optimized Feature Selection and Categorization of Medical Records with Multi Kernel Boosted Support Vector Machine   Order a copy of this article
    by Lakshmii Prasannaa 
    Abstract: With the fast growth of Internet and mobile usage, huge volumes of medical documents, which contain information of patients, diagnostic, past disease history and medication, are being generated electronically. In the field of text mining, document categorization has become one of the emerging techniques due to large volume of documents in the form of digital data. The main objective of the proposed work is to identify disease treatment relationships and predict the diseases among medical articles. In this paper, highly relevant and more correlated features have been extracted using Probabilistic Latent Dirichlet Allocation (P-LDA) and randomized iterative feature selection approach. These features were classified with Multi Kernel Boosted Support Vector Machine (MKB-SVM), and then their performance was evaluated on both PubMed and MEDLINE databases. Performance evaluation of the proposed approach on DB-1 and DB-2 was 98.7% and 92%, respectively. The evaluation illustrated that the proposed approach outperformed the existing state-of-the-art classification methods.
    Keywords: Latent Dirichlet Allocation; medical text classification; SVM classification; AdaBoost; multi-kernel.

  • Optimization of Sparse Linear Array Using State Transition Algorithm   Order a copy of this article
    by Pratistha Brahma, Banani Basu 
    Abstract: The state transition algorithm (STA) has been used for sparse antenna array design. A sparse linear array consisting of different core elements has been optimised using STA. The optimal solution is searched by changing the number of sparse elements and the current excitation values of the core elements under a set of practical constraints. The number and positions of sparse elements are optimised in order to achieve minimum side lobe level (SLL) for a given half power beam width (HPBW) using various design examples. The paper studies the trade-off between SLL and directivity of the array for different numbers and positions of the sparse elements. Results obtained using STA have been statistically compared with those of the particle swarm optimisation (PSO) algorithm and the artificial bee colony (ABC) algorithm, and show improved performance.
    Keywords: Sparse Antenna Array; STA; PSO; ABC; SLL; Directivity.
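The SLL objective above is computed from the array factor of the linear array. A minimal sketch for evaluating the array factor and a peak sidelobe level relative to broadside (element positions in wavelengths; the sidelobe angle grid is a hypothetical choice the caller must keep outside the main lobe):

```python
from cmath import exp
from math import pi, sin, radians, log10

def array_factor(positions, excitations, theta_deg):
    """|AF| of a linear array; positions are in wavelengths and the
    angle is measured from broadside."""
    psi = 2 * pi * sin(radians(theta_deg))
    return abs(sum(a * exp(1j * p * psi) for p, a in zip(positions, excitations)))

def peak_sidelobe_db(positions, excitations, sidelobe_angles):
    """Peak sidelobe level in dB relative to the broadside main beam."""
    main = array_factor(positions, excitations, 0.0)
    side = max(array_factor(positions, excitations, a) for a in sidelobe_angles)
    return 20 * log10(side / main)
```

An optimiser such as STA, PSO or ABC would search positions and excitations to minimise this sidelobe measure under HPBW constraints.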

  • QOE-based recommender system in the applications of e-learning resources   Order a copy of this article
    by G. Senthil Kumar, C. Lakshmi 
    Abstract: Web services are an evolving and widely accepted technology in industry and other management sectors. They are self-contained software systems that can be published and invoked through the web. Nowadays web services are prevalent in the development of educational resources such as learning management systems (LMS), massive open online courses (MOOCs) and e-learning. The availability of these educational resources on the web provides a platform for obtaining and sharing knowledge. In the current e-learning landscape, selecting the right course according to user requirements is a tiresome and time-consuming process because numerous courses are available on the web. In such situations, to judge a course the user can rely on the quality of service (QoS) descriptions given by the service provider at the time the course is registered, but QoS descriptions are not always reliable: service providers can publish false and inaccurate information about a course to attract customers and gain additional profit, leading to user dissatisfaction when requirements are not fulfilled. In this paper, an alternative measure called quality of experience (QoE) is adapted as a metric for judging courses. QoE is a subjective measurement that reflects user experience of a service, and is considered here the primary measure for judging courses. The main objective of this paper is to describe how educational resources can be efficiently handled as web services using quality of experience, and to build an efficient e-learning recommender system that guides the user to opt for a course according to his requirements, i.e. availability, cost and reputation.
    Keywords: E-learning; Learning Management Systems; Quality Of Service; Quality of Experience; Web Services.
    DOI: 10.1504/IJAIP.2019.10021026
  • VLSI Realization of an Efficient Image Scalar Using Vedic Mathematics   Order a copy of this article
    by V. Ramadevi, K. Manjunatha Chari 
    Abstract: A low-complexity algorithm using Vedic mathematics is intended for VLSI realisation of an efficient image scalar. The proposed scalar comprises a modified area-pixel interpolator, an edge detector and a Vedic multiplier. To decrease the blurring and aliasing effects created by the area-pixel model and to preserve image edge features productively, an edge-catching technique is employed. Moreover, a Vedic division unit is utilised to enhance the performance of the scaling processor without any rounding error correction techniques, additionally reducing power consumption at all levels of the digital system. The proposed architecture achieves a gate count of 5.28 K at 200 MHz with a computation time of 14.37 ns, synthesised in 0.13-μm CMOS technology. Compared with previous techniques, this work reduces the gate count by 18% and requires only a one-line-buffer memory.
    Keywords: Image scalar; line buffer; sharpening filter; Vedic mathematics; VLSI;.

    by Anudeep J, Kowshik G, Giridhararajan R, Shriram KV 
    Abstract: These days, storing money, gold and other valuables in the bank lockers has become a worrying aspect to the citizens all around the world. According to the statistics on the bank robberies and loots, India almost lost $27.9 million(180 crore rupees) only on loots and burglaries in past 3 years .And there are cases being noticed where the burglars attempted to loot the bank with a disguised costume of a nun on them so as to make the bank managers believe that they are the original owners of the locker. Apart from the incidents happened all around the world, improvisations done to do the lockers safeties every year was mostly found are of only in the mechanical way i.e., lockers were given strength by manipulating the materials used. But, unlike to all those works, we come up with a system that could effectively face these kinds of problems and could even log the data like time of access to locker, changes occurred in the weight of the locker etc., and increase the security of the bank lockers making the individuals feel much safer on their property.As there is no intervention of men it will be a more accurate and safer method.The proposed system works with two levels of security, one of them is face recognition of the owner with the priorly given photo of owner during his registration in the bank.They should pass the face recognition test after which they will enter the second level of authentication where the user has to set the handles to a unique angle key(which is similar that of an ATM Pin) which is provided to them. Our system recognizes the face of the person who visits the bank for access to the locker, by using haar classifier, edge, and line detections features and the faces available in the database and activates the access to the only respective locker. 
One can noticeably understand that when it is said the person is given access to the locker that means all the other lockers stay deactivated for the access and any trail to open them, triggers the alarm. When the person reaches locker there is a second stage of security, where the person has to open the locker by rotating the handles to a certain angle.This action needs care and can be done perfectly by the owner alone. So, this system could effectively enhance the security of the lockers
    Keywords: Material strength; Haar cascade features,edge and linerndetections,rotating handle locker.

  • Simulation and Practical Implementation under Different Scenarios of Indirect Incremental Conductance Algorithm for MPPT of PV System.   Order a copy of this article
    by Noureddine Bouarroudj, Amor Fezzani, Boualam Benlahbib, Bachir Batoun, Said Drid, Djamel Boukhetala 
    Abstract: Simplicity and good tracking performance have made the incremental conductance (INC) algorithm the most widely used algorithm for maximum power point tracking (MPPT) of photovoltaic (PV) systems. This paper treats the simulation and practical implementation of the indirect INC algorithm with a conventional proportional-integral (PI) controller under different scenarios. Firstly, a comparison between the indirect INC algorithm and the direct one is carried out, in which the indirect algorithm is shown to be superior. Secondly, a simulation using MATLAB/Simulink is conducted under standard climatic conditions, under an immediate change of irradiance, and under an immediate change of resistive load value. Finally, the validated indirect INC algorithm is implemented on a real prototype under the same scenarios as in the simulation.
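    The decision rule at the heart of any INC tracker (direct or indirect) is the comparison of the incremental conductance dI/dV with the instantaneous conductance -I/V, which are equal at the maximum power point. The sketch below illustrates that generic test only; the step size, variable names and tie handling are our own illustrative choices, not the authors' implementation:

```python
def inc_step(v, i, v_prev, i_prev, step=0.01):
    """One iteration of the incremental-conductance MPPT decision rule.
    Returns the voltage-reference adjustment: +step, -step or 0."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        # Voltage unchanged: react to the current change alone
        if di == 0:
            return 0.0
        return step if di > 0 else -step
    # At the MPP: dI/dV == -I/V (equivalently dP/dV == 0)
    g_inc, g_inst = di / dv, -i / v
    if g_inc == g_inst:
        return 0.0
    # Left of the MPP (dP/dV > 0): raise the voltage; right of it: lower it
    return step if g_inc > g_inst else -step
```

    In the indirect variant described in the paper, this adjustment would drive a PI-controlled duty-cycle reference rather than the duty cycle directly.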
    Keywords: PV-module; Boost converter; Direct INC algorithm; Indirect INC algorithm.
    DOI: 10.1504/IJAIP.2019.10021027
  • Signless Laplacian Energy of Bipolar Fuzzy Graphs with Application   Order a copy of this article
    by Hossein Rashmanlou, Muhammad Akram, Danish Saleema 
    Abstract: This paper presents certain notions, including the Laplacian energy of bipolar fuzzy graphs (BFGs, for short), the signless Laplacian energy of BFGs, the Laplacian energy of bipolar fuzzy digraphs (BFDGs, for short) and the signless Laplacian energy of BFDGs. Further, it describes useful properties and bounds of the Laplacian energy and signless Laplacian energy of BFGs. Moreover, this article discusses an application of the proposed concepts in decision-making.
    Keywords: Laplacian energy of bipolar fuzzy graphs; signless Laplacian energy; decision-making.

  • A Low Quality Medical Imaging Registration Technique for Indian Telemedicine Environment   Order a copy of this article
    by Syed Thouheed Ahmed, Sandhya.M Sandhya.M, Sharmila Sankar 
    Abstract: Telemedicine is growing in India, and the Indian environment needs improvement in acquiring and transmitting datasets for consultation and diagnosis; these attributes are correlated with internal image quality enhancement. In this paper, a medical imaging registration and re-verification technique is proposed for low-quality datasets transmitted over an under-rated transmission channel. The registration approach is integrated with multiple samples of the acquired datasets, processed sequentially, thus improving the mapping, the transformation time and the peak signal-to-noise ratio. The re-verification process provides double authentication by comparing the registered image with a referenced sample. The approach is tested on open medical data samples of the UCL repository transmitted under the low line bandwidth of Indian transmission channels and internet standards. The proposed approach serves as a better means of diagnosis and feature extraction for tele-diagnosis and consultation in the Indian rural telemedicine environment.
    Keywords: Image registration; India Telemedicine; Medical Image Processing.

  • An inventive and Innovative Integrated home network media system A novel approach   Order a copy of this article
    by Aswin Tekur, K.V. Shriram 
    Abstract: A common problem in TV viewing is that, on a single TV set, only one programme can be watched at a time. Streaming is continuous and real-time, causing interruptions and clashes among viewers who wish to watch different channels at the same time, ultimately ruining the viewing experience. Our objective is to present a hardware unit, compatible with a smart TV set, that accommodates a range of features: simultaneous transmission of video from multiple channels to network-connected devices over a Limited Area Network, pause/play options, and recording for a certain duration (made possible using memory buffers). These functionalities require modification of the TV architecture of currently available systems. One component of our system, the transmission unit, can either be an additional hardware component in the TV set or take the form of a portable flash drive (functioning similarly to a dongle) for convenience and portability. The proposed system aims to make great strides in the field of television viewership and offers meaningful convenience features to enrich the viewer's experience, with minimal modification to presently used television systems, while making more use of network-connected devices such as laptops, tablets and smartphones. By eliminating the need for multiple TV connections and using our system instead, one can save money and time and obtain many new features that are not offered by currently available television sets.
    Keywords: Simultaneous video transmission; Advanced TV system; TV system over Limited Area Network; Home environment Television system; Multi viewer multi program TV system;.

  • Bi-Directional Sensor Placement for K-Coverage Solution   Order a copy of this article
    Abstract: In wireless sensor networks (WSNs), providing every point with at least K-fold coverage is called the K-coverage solution. To ensure the effective use of resources, omni-directional sensors have normally been deployed in the region of interest (ROI). However, owing to their poor energy consumption, some locations are better suited to bi-directional sensors. We propose a K-coverage solution using bi-directional sensors, with the objective of providing coverage in a sensor network while reducing energy consumption: by using directional sensors, data transmission in all directions can be reduced, overcoming a major disadvantage of omni-directional sensors. This work addresses how to improve coverage using a directional sensor model. Further, the number of directional sensors needed for a given coverage rate is estimated, and the coverage probability of the ROI for N directional sensors is evaluated and used to determine the ratio of bi-directional to omni-directional sensors. Moreover, we provide a linear formulation of coverage, which is then relaxed and solved using a heuristic approach (manual manipulation of sensor positions). The proposal in this paper provides K-coverage while utilising the classified part of the sensors as proposed in the system model, optimising resources and respecting the WSN application constraints. In our simulation, the effects of offset angle and radius over multiple rounds are estimated; the results reflect a promising improvement over existing proposals.
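    As a concrete illustration of a bi-directional sensing model, a point-coverage test for a sensor with two opposite sector lobes might look as follows; the geometry and the parameter names (radius, heading, half-angle) are our own illustrative assumptions, not the paper's exact model:

```python
import math

def covers(sensor, point, radius, heading, half_angle):
    """True if `point` lies inside either of the two opposite sectors
    sensed by a bi-directional `sensor`. Angles are in radians."""
    dx, dy = point[0] - sensor[0], point[1] - sensor[1]
    if math.hypot(dx, dy) > radius:
        return False                       # outside sensing range
    theta = math.atan2(dy, dx)
    # Angular distance from the heading, folded into [0, pi]
    diff = abs((theta - heading + math.pi) % (2 * math.pi) - math.pi)
    # The opposite lobe covers directions near pi away from the heading
    return diff <= half_angle or diff >= math.pi - half_angle
```

    A K-coverage check over the ROI would then count, for each sampled point, how many deployed sensors return True.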
    Keywords: Wireless Sensor Network; Omnidirectional sensor; Bi-Directional Sensor; Coverage; K-Coverage.
    DOI: 10.1504/IJAIP.2019.10031795
  • Domination and Product Domination in Intuitionistic Fuzzy Soft Graphs   Order a copy of this article
    by R. Jahir Hussain, S. Satham Hussain, Sankar Sahoo, Madhumangal Pal, Anita Pal 
    Abstract: This manuscript deals with the domination and product domination of intuitionistic fuzzy soft graphs. Using the concepts of the strength of a path, strength of connectedness and strong arcs, the domination set is established. The necessary and sufficient condition for the minimum domination set of an intuitionistic fuzzy soft graph is investigated. Further, some properties of the domination number of product intuitionistic fuzzy soft graphs are obtained, and the proposed concepts are described with suitable examples. The weight of a domination of an intuitionistic fuzzy soft graph is also established.
    Keywords: Intuitionistic fuzzy graphs; Fuzzy soft graphs; Product domination; Strength of connectedness.
    DOI: 10.1504/IJAIP.2019.10022975
  • Novel Deep Learning Model with Fusion of Multiple Pipelines for Stock Market Prediction   Order a copy of this article
    by Abhishek Verma 
    Abstract: Deep learning has become a powerful tool for modelling complex relationships in data. Convolutional neural networks (CNNs) constitute the backbone of modern machine-intelligence applications, while long short-term memory (LSTM) layers have been widely applied to problems involving sequential data, such as text classification and temporal data. By combining the power of multiple CNN pipelines for extracting features from data with that of LSTM layers for analysing sequential data, we have produced a novel model that improves stock market prediction performance by 20% over a single-pipeline model and by a factor of five over a support vector regressor model. We also present multiple variations of our model to show how we have increased accuracy while minimising the effects of overfitting. Specifically, we show how changes in the parameters of our model affect its training and testing scores, and compare the performance of a multiple-pipeline model using three different kernel sizes with that of a single-pipeline model.
    Keywords: Stock prediction; S&P500; CNN; LSTM; Deep learning.

  • Investigation of Binding Update Schemes in Next Generation Internet Protocol Mobility   Order a copy of this article
    by Mathi Senthilkumar 
    Abstract: IPv6 development and deployment have opened up several concerns regarding the transition from IPv4 to IPv6 and its practical challenges. Mobility support is one of the significant features of IPv6, handled more efficiently than in IPv4. Mobile IPv6 allows a mobile device to remain connected to the correspondent node even when it moves to another network. The current location of the mobile node is communicated to the home agent and correspondent node through binding update schemes. With the aim of exploring this improved network and supporting IPv6-related research, the binding update schemes of mobile IPv6 are examined in this paper. The paper emphasises the number of messages exchanged between the communicants during the location update of mobile devices in the various binding update schemes of mobile IPv6. In addition, it discusses the attacks on the binding update messages of the existing methods. Finally, the paper presents a comparative analysis of the various binding update schemes based on the number of messages communicated between nodes.
    Keywords: Mobile IPv6; Home agent; Binding update; Route optimization; Correspondent node.

  • Predicting Elections Over A Twitter: A Campaign Strategies Of Political Parties Using Machine Learning Algorithms   Order a copy of this article
    by G. Anuradha, Nageswara Rao Moparthi, Sridhar Namballa 
    Abstract: Twitter is a social networking web application built to find out what is happening around us and all over the world. For this study, we analysed one million tweets using sentiment analysis, which is helpful for analysing information whose opinions are positive, negative or neutral, and which is highly structured or, in some cases, heterogeneous. In this article we used machine learning algorithms such as Naïve Bayes, SVM, decision trees and neural networks.
    Keywords: Twitter; Sentiment; Naïve Bayes; SVM; DT; Neural Network.
    DOI: 10.1504/IJAIP.2019.10021029
  • A Key Pre-distribution Protocol for Node to Node and Group Communication in Wireless Sensor Networks using Key Pool Matrix   Order a copy of this article
    Abstract: Sensor networks are in huge demand in various fields such as the military, environmental monitoring, hospitals and many hostile environments. Further, they are also used in Internet-of-Things applications, where a large number of sensors are connected through the internet. These applications raise security issues such as confidentiality, authentication and integrity because of their deployment areas and the sensitivity of the data. Considering these issues, key management plays an important role in many information security solutions used for information protection. The proposed work examines the various vulnerabilities in sensor networks and addresses them through the proposed key distribution scheme. Key generation and distribution are implemented using a key pool matrix. Comparison and analytical analysis show that the proposed work requires less communication and storage space at each sensor. Further, the proposed work can also increase resilience and reduce key compromise and the number of revocation operations compared with other schemes.
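    The general idea of deriving pairwise keys from a shared pool matrix can be pictured as below: a symmetric n x n matrix guarantees that nodes i and j both hold K[i][j], so each node needs to store only one row. This is a generic matrix-based pre-distribution sketch, not the paper's scheme; the hash-based key derivation is our own illustrative choice:

```python
import hashlib

def make_key_pool(n, master=b"pool-seed"):
    """Build a symmetric n x n key-pool matrix: pool[i][j] == pool[j][i]."""
    pool = [[None] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            # Derive each cell deterministically from a master secret
            k = hashlib.sha256(master + bytes([i, j])).hexdigest()
            pool[i][j] = pool[j][i] = k
    return pool

def node_keyring(pool, i):
    """Node i stores only row i; nodes i and j then share pool[i][j]."""
    return list(pool[i])
```

    Storage per node is thus linear in the network size rather than quadratic, which is the usual motivation for matrix-based schemes.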
    Keywords: Key pre-distribution in wireless sensor networks; Attacks; Node capture.

  • New Features for Language Recognition From Speech Signal   Order a copy of this article
    by M. Sadanandam, V. Kamaskhiprasad 
    Abstract: In this paper, we derive new feature vectors for identifying the language from a short utterance of speech by an unknown person. By applying a windowing technique to the speech signal, Mel-frequency cepstral coefficients (MFCC) and the formants of speech are extracted. From these two kinds of features, we derive a new feature set using cluster-based computation. A classifier is then designed for each language using the new feature vectors and applied to the recognition output with specific a priori knowledge. We use the OGI database to perform the experiments and achieve good recognition performance.
    Keywords: Formant frequencies; Language Identification; MFCC; LID; Minimum phase group delay; LID using new feature set.
    DOI: 10.1504/IJAIP.2019.10023086
  • Knowledge Mining from Project Retrospect for Organizational Learning in the Responsive Software Engineering Areas   Order a copy of this article
    by Manikanta Reddy 
    Abstract: The field of Knowledge Management (KM) offers an extensive set of practices for identifying, collecting, storing and sharing the insights and experiences of people and organisations at work. Contemporary practice shows that agile methodologies are deployed widely by results-driven organisations to deliver working software at a faster rate. Clearly, the continual attrition of human capital has prompted these organisations to deploy more proactive instruments for capturing knowledge, lessons learnt and best practices acquired during past processes. If properly stored, these agile retrospectives provide a rich source of learning for managing recurring and routine issues on the one hand and for steering the existing software development process on the other. The proposed work aims to describe a system for capturing tacit knowledge generated through the software development process in various forms such as lessons learnt, expert insights and meetings. A lexical resource, ASEWordNet, specific to the agile software engineering domain, has been developed. This paper considers applying opinion mining and information retrieval techniques with the SENTIWORDNET and ASEWordNet lexicons to test datasets, in order to classify the aspects of the captured lessons containing such opinions and verbatim comments.
    Keywords: Knowledge Management; Responsive programming enhancement; Opinion mining; Retrospectives.

  • A novel approach for assessing the damaged region in MRI through improvised GA and SGO   Order a copy of this article
    Abstract: A plethora of magnetic resonance (MR) image segmentation methods exist in the published literature, but most fail to recognise small regions in MR images accurately owing to inefficient segmentation techniques. In this article, we propose a novel and efficient MR image segmentation technique which employs an improvised genetic algorithm (GA) based on twin-point crossover mutation for automated segmentation. The resultant image from the GA is used as input to the Social Group Optimisation (SGO) technique, a lightweight and computationally efficient algorithm, for refining the segmented image. We have carried out experiments on benchmark and real-time images to compare the proposed technique with existing segmentation methods that use teaching-learning-based optimisation (TLBO). We observe that the proposed approach exhibits better performance than its counterpart.
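    A twin-point (two-point) crossover of the kind mentioned above can be sketched as follows. The chromosome encoding (a flat list of candidate values, e.g. segmentation thresholds) and the operator details are illustrative assumptions, not the authors' exact improvised GA:

```python
import random

def two_point_crossover(a, b, rng=random):
    """Two-point crossover: swap the middle segment between two
    equal-length parent chromosomes. A generic GA operator sketch."""
    assert len(a) == len(b) >= 3
    p, q = sorted(rng.sample(range(1, len(a)), 2))  # two distinct cut points
    return a[:p] + b[p:q] + a[q:], b[:p] + a[p:q] + b[q:]
```

    Each child inherits the outer segments from one parent and the middle segment from the other, so every gene position still carries a value from one of the two parents.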
    Keywords: Harmonic Mean; Genetic algorithm; Social Group Optimization; Laplacian; Magnetic Resonance Imaging.

  • Prediction of Airfoil Self-Noise using Polynomial Regression, Multivariate Adaptive Regression Splines, Gradient Boosting Technique and Deep Learning Technique   Order a copy of this article
    by Sanjiban Sekhar Roy, Paridhi Singh, Gobind Manuja, Raghav Sikaria, Maharishi Parekh 
    Abstract: In the 21st century, human life is advancing at an immeasurable pace that has become incompatible with the pace at which our mother Earth can adjust itself. This has drastically resulted in the depletion of resources and the wastage of energy. To save resources, we are trying to find perpetual resources, and to save energy, we are trying to build more efficient systems and machines. Hence, researchers are trying to reduce wastage of energy wherever possible. One such cause of energy wastage is the generation of airfoil acoustic noise, and attempts by scientists and researchers to minimise this noise date back to as early as 1989. Noise plays a significant role in the design of automobiles, aircraft, turbines, etc. There have been various recent studies of airfoil self-noise: its generation, its prediction, how to curb it, and its various ill effects. Noise estimation needs to be accurate so that further studies to reduce noise from airfoil models can be performed efficiently; thus, the development of a coherent noise-prediction tool is vital. Through this paper, we try to identify the best such noise-prediction tool by discussing and comparing certain regression models. We divided our dataset into training and testing components, and the results are illustrated using tables and graphs. It is observed that Multivariate Adaptive Regression Splines (MARS) and polynomial regression models show reasonable output, whereas outstanding results are obtained by applying deep neural networks and an ensemble learning method, the gradient boosting method, to the airfoil self-noise prediction problem.
    Keywords: Airfoil acoustic noise; Prediction; Multivariate Adaptive Regression Splines; Deep Neural Network; Polynomial Regression; Gradient Boosting.

  • Technical review on ontology merging   Order a copy of this article
    by Kaladevi Ramar 
    Abstract: The emergence of the semantic web has made many ontologies available and accessible through the web, automatically increasing the usage and application of ontologies. A single domain ontology is not sufficient to support the requirements anticipated by distributed scenarios; more ontologies need to be utilised from other applications. These requirements motivate the integration of similar ontologies. However, the main issue is that there is no unique optimum solution for a merging method, and merging can be achieved asymmetrically or symmetrically. In this paper, various merging techniques are analysed with their pros and cons. From this analysis, the issues in the existing methods are listed and possible research directions for future enhancement are discussed. We hope this can open a way to improve merging solutions according to application requirements.
    Keywords: ontologies; merging; semantic web; heterogeneity.

  • A Fast R-CNN based novel and improved object recognition technique.   Order a copy of this article
    by Shriram K Vasudevan, Aswath GI, Sargunan Ramaswamy, Vimal Kumar K 
    Abstract: Humans have the natural ability to identify objects easily on their own, but a machine cannot. An algorithmic description of the recognition task has to be implemented on machines to identify an object in an image. A very challenging and tough task in computer vision is to detect objects and to estimate their pose. Object recognition remains one such key capability in computer vision technology, used for identifying a specific object in a digital image or video, and object recognition algorithms are highly important in real-world applications. Object detection is much more complex and challenging than image classification. Applications include biometric recognition, industrial inspection, robotics, intelligent vehicle systems, human-computer interaction, etc. In a retail business, identifying the products of a single manufacturer is difficult. This research uses the Fast R-CNN algorithm to detect the products of a particular manufacturer, Procter & Gamble.
    Keywords: Object Recognition; Deep Learning; Machine Learning; CNN; R-CNN; Fast R-CNN;.

  • L(2,1) and Surjective L(2,1) Labeling of Cartesian Product Between Two Complete Bipartite Graphs   Order a copy of this article
    by Sumonta Ghosh, Anita Pal 
    Abstract: As we expect effective and efficient communication over a complex network, we consider the graph G = K_{m,n} X K_{p,q}, label it with L(2,1) labeling, and investigate the bound lambda_{2,1}(G) in terms of m and n. We also raise a few demerits of L(2,1) labeling and introduce surjective L(2,1) labeling as a remedy. Surjective L(2,1) labeling follows the restrictions of L(2,1) labeling while requiring all labels to be unique and bounded by the cardinality of the vertex set of the graph. We also apply surjective L(2,1) labeling to the graph G = K_{m,n} X K_{p,q}. In this paper we design three different algorithms to implement the above labelings and analyse their time complexity.
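    For readers unfamiliar with the constraint, a small validity checker for an L(2,1) labeling (adjacent vertices differ by at least 2; vertices at distance two receive distinct labels) might look like this. It is a didactic sketch, not one of the paper's three algorithms:

```python
from itertools import combinations

def is_l21(adj, label):
    """adj: dict vertex -> set of neighbours; label: dict vertex -> int.
    Checks the two L(2,1) conditions over all vertex pairs."""
    for u, v in combinations(adj, 2):
        d = abs(label[u] - label[v])
        if v in adj[u] and d < 2:
            return False                   # adjacent: need a gap of >= 2
        if v not in adj[u] and adj[u] & adj[v] and d < 1:
            return False                   # distance two: labels must differ
    return True
```

    The surjective variant would additionally require the label map to be a bijection onto a set of consecutive integers up to the number of vertices.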
    Keywords: Cartesian product; L(2,1) labeling; surjective L(2,1) labeling; complete bipartite graph.

  • Median relative intersection of confidence intervals for bandwidth estimation in mean shift clustering technique   Order a copy of this article
    by Prasad Kaviti, Valli Kumari Valli Kumari 
    Abstract: The mean shift algorithm is a non-parametric iterative algorithm widely used in segmentation, clustering, object tracking, etc. However, tuning the bandwidth parameter and selecting a kernel that ensures convergence is required. This paper proposes a modified mean shift in terms of bandwidth selection and adequate kernel selection: mean shift equipped with the median relative intersection of confidence intervals (MRICI) for multispectral image clustering. Initially, different kinds of bandwidth estimators (static, Silverman, Scott, ICI and MRICI) are evaluated, and four classes of kernels with general convergence are considered: Gaussian, Epanechnikov, flat and biweight. Later, different combinations of the four kernel classes and the different bandwidth estimators of mean shift are evaluated. Results show an improvement in intra-cluster similarity, based on the silhouette measure, for MRICI bandwidth estimation using the Gaussian kernel of mean shift when compared to the other combinations.
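    A single mean-shift iteration with a Gaussian kernel, the combination the abstract reports as best, can be sketched for the one-dimensional case as follows; the bandwidth value and the data are illustrative only, and the MRICI bandwidth estimator itself is not reproduced here:

```python
import math

def mean_shift_step(x, points, bandwidth):
    """One mean-shift update of x: the Gaussian-kernel weighted mean of
    the sample points. Iterating this converges x towards a density mode."""
    w = [math.exp(-0.5 * ((x - p) / bandwidth) ** 2) for p in points]
    return sum(wi * p for wi, p in zip(w, points)) / sum(w)
```

    The bandwidth controls how far each sample's influence reaches, which is why its estimation (the paper's focus) dominates clustering quality.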
    Keywords: mean shift clustering; kernels; bandwidth; confidence intervals; multispectral images.

    by Venkata Nagendra 
    Abstract: The gradient boosting algorithm [1] was produced for high predictive capability: each round fits a single decision tree to minimise the errors of the previous trees, but building even small models takes a large amount of time. To overcome these drawbacks, eXtreme Gradient Boosting (XGBoost) [2] was developed; it decreases the model-building time and increases performance. Our experimental results demonstrate that the Enhanced Gradient Boosting (EGB) algorithm performs better than the remaining algorithms, such as XGBoost and Gradient Boosting (GB), in the context of class-imbalanced datasets. The EGB algorithm works in the same way as XGBoost and also works on balanced data with high accuracy; EGB works well on both balanced and imbalanced data. The results obtained show that the area under the curve (AUC) achieved by EGB is higher than that achieved by XGBoost.
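    The residual-fitting loop that GB, XGBoost and (per the abstract) EGB all build on can be illustrated with a minimal squared-error booster over decision stumps. This is a didactic sketch of plain gradient boosting, not the EGB algorithm itself; the data, learning rate and round count are illustrative:

```python
def fit_stump(x, r):
    """Best single-threshold stump minimising squared error against r."""
    best = None
    for t in sorted(set(x)):
        left = [ri for xi, ri in zip(x, r) if xi <= t]
        right = [ri for xi, ri in zip(x, r) if xi > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((ri - (lm if xi <= t else rm)) ** 2
                  for xi, ri in zip(x, r))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    return best[1:]

def gradient_boost(x, y, rounds=50, lr=0.1):
    """Squared-error gradient boosting: each stump fits the residuals
    (negative gradient) of the current ensemble's predictions."""
    pred = [sum(y) / len(y)] * len(y)
    for _ in range(rounds):
        r = [yi - pi for yi, pi in zip(y, pred)]   # current residuals
        t, lm, rm = fit_stump(x, r)
        pred = [pi + lr * (lm if xi <= t else rm)
                for xi, pi in zip(x, pred)]
    return pred
```

    With squared error the residuals shrink geometrically (by roughly 1 - lr per round here), which is the slow sequential process that XGBoost accelerates with regularisation and parallel tree construction.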
    Keywords: Machine Learning; Boosting; Gradient Boosting; Enhanced Gradient Boosting; eXtreme Gradient Boosting (XGBoost); Multithreading.

  • An Efficient Approach for Storage of Court Judgments Using Graph Database   Order a copy of this article
    by Varsha Mittal, Durgaprasad Gangodkar, Bhasker Pant 
    Abstract: Extracting relevant information from a large dataset has always been challenging, and prediction even more so. One such issue is arranging a large volume of court cases into meaningful data; the text mining approach has its limitations when a large volume of data contains similar types of information. The authors have taken on the challenging task of utilising the capabilities of different technologies, such as Hadoop and graph databases, to produce an integrated solution for storing volumes of court verdicts in a scalable and well-organised format. This paper aims at developing an e-dictionary for the courts using simplified, case-wise keyword dictionary creation that can be used to understand the judgments pronounced in earlier similar cases. To meet the challenge of breaking the text into tokens and obtaining word counts without compromising speed, a single Hadoop cluster is used, with a graph database finally storing the data. The model is guided by various graph and set theories and is an effort to answer the queries of judges and lawyers effectively, reducing the effort of reading whole cases by referring only to the final judgment; the use of graph theory provides a logical view of the stored data. To check the robustness of the proposed model, precision, recall and accuracy were calculated and found to be 1.0, 0.92 and 92.66% respectively.
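    The case-wise keyword dictionary step can be pictured with a toy tokenise-and-count routine, playing the role the Hadoop word count plays in the pipeline; the stop-word list and frequency ranking are illustrative assumptions only:

```python
import re
from collections import Counter

STOP = frozenset({"the", "of", "and", "to", "in", "is", "a"})

def case_keywords(judgment_text, top=5):
    """Toy case-wise keyword extraction: tokenise, drop stop words,
    keep the most frequent terms for the case's dictionary entry."""
    tokens = re.findall(r"[a-z]+", judgment_text.lower())
    counts = Counter(t for t in tokens if t not in STOP)
    return [w for w, _ in counts.most_common(top)]
```

    In the proposed system, each case node in the graph database would then be linked to its keyword nodes, so similar judgments become reachable by shared keywords.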
    Keywords: Hadoop; graph database; text mining; dictionary; map-reduce.
    DOI: 10.1504/IJAIP.2019.10027646
  • Handwritten North Indian Script Recognition Using Machine Learning: A Survey   Order a copy of this article
    by Reya Sharma, BAIJ NATH KAUSHIK, Naveen Gondhi 
    Abstract: Handwritten script recognition is an interesting and significant area of research owing to the wide variety of challenges in handwritten Indian scripts. Intensive research is available on the recognition of scripts such as Chinese, Roman, Arabic and Japanese, but the research work done on Indian scripts is still in its infancy; therefore this paper presents a review of the recognition of various handwritten North Indian scripts. The variety of techniques associated with the feature extraction and classification of handwritten North Indian scripts is discussed precisely in this work. With this survey, an attempt has been made to address and highlight the significant results obtained so far in this field; these results are presented in tabular form so as to provide a clear idea at a glance. The survey also provides beneficial future directions for research on handwritten North Indian scripts by analysing the existing difficulties and the steps needed for the development of OCR for North Indian scripts.
    Keywords: Handwritten Character Recognition; OCR; Devanagari; Gurmukhi.

  • Adaptive Hybrid Transmit Power Control (AHTPC) Algorithm for Wireless Body Area (WBAN) Networks   Order a copy of this article
    by Rajkumar Rajkumar, Samundiswary Samundiswary 
    Abstract: Energy-efficient transmission is considered a key solution for WBANs, as it can facilitate long-term, low-power operation. Several transmit power control (TPC) techniques for WBANs are available in the literature. A TPC technique can be proactive or reactive depending on the channel conditions: while reactive approaches involve control packet overhead and additional delay, proactive approaches are prone to prediction errors and involve prediction delay. Hence, a hybrid technique is needed which combines the advantages of both proactive and reactive techniques for all types of channel conditions. In this paper, an Adaptive Hybrid TPC (AHTPC) algorithm for WBANs has been developed. In AHTPC, the base station (BS) measures the Received Signal Strength Indicator (RSSI) and Packet Delivery Ratio (PDR) values within an adaptive timer period and stores them in a channel sample matrix. If the RSSI and PDR values fall outside the range of certain lower and upper bounds, the Reactive Transmission Power Control (RTPC) algorithm is executed. If the difference between consecutive RSSI samples becomes large and the channel is considered fluctuating, the Proactive Transmission Power Control (PTPC) algorithm is executed. Simulation results show that the AHTPC algorithm has reduced energy consumption, less delay, less control packet overhead and an increased delivery ratio compared with purely proactive and reactive techniques.
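    The hybrid selection rule described above can be sketched as a simple classifier over recent channel samples; all thresholds and the exact fluctuation test here are our own illustrative assumptions, not the AHTPC parameters:

```python
def choose_mode(rssi, pdr, rssi_lo=-90.0, rssi_hi=-70.0, pdr_lo=0.8,
                fluct_thresh=5.0):
    """Pick a TPC mode from sampled RSSI values (dBm) and the PDR,
    mirroring the hybrid rule: predict on fluctuating channels,
    react when samples leave the target band, otherwise hold."""
    fluct = max(abs(a - b) for a, b in zip(rssi, rssi[1:]))
    if fluct > fluct_thresh:
        return "proactive"      # fast-changing channel: predict ahead
    if rssi[-1] < rssi_lo or rssi[-1] > rssi_hi or pdr < pdr_lo:
        return "reactive"       # out of band: correct with feedback
    return "hold"
```

    Keeping the per-sample test this cheap matters in WBANs, since the decision runs at the base station every adaptive timer period.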
    Keywords: Wireless Body Area Networks (WBAN); Power Control; Adaptive; Hybrid; Channel condition.

  • eeFFA/DE- A Fuzzy Based Clustering Algorithm using Hybrid Technique for Wireless Sensor Networks   Order a copy of this article
    by Richa Sharma, Vasudha Vashisht, Umang Singh 
    Abstract: Designing an energy-aware clustering algorithm for wireless sensor networks (WSNs) has become an issue of great concern in the scientific community, owing to the non-rechargeable nature of the battery-operated sensor devices that are the main building blocks of these networks. Clustering the sensor nodes into disjoint groups has proven to be one of the best energy-saving approaches. This paper presents a fuzzy-based clustering algorithm named eeFFA/DE to achieve energy efficiency in WSNs. The proposed algorithm comprises two phases. The first phase clusters the nodes using a distributed approach named Balanced Clustering Algorithm with Distributed Self-organisation (DSBCA). The second phase critically analyses and selects cluster heads using two metaheuristic approaches, the firefly algorithm and the differential evolution technique, evaluating the fitness value of each individual node. The proposed algorithm also emphasises fault tolerance in sub-cluster-head selection. Experimental results validate the efficiency of the eeFFA/DE algorithm using metrics such as dead nodes per round, network throughput and residual energy of the nodes per round.
    Keywords: clustering; firefly algorithm; network lifetime; network throughput; residual energy.
    DOI: 10.1504/IJAIP.2019.10025734
  • An Empirical analysis of software maintainability metrics: Object-Oriented Approach versus Traditional   Order a copy of this article
    by Yenduri Gokul 
    Abstract: Software is a blend of creativity and engineering that plays a major role in many fields, and today it is predominantly developed using the Object-Oriented approach. Software quality is foremost because it has a vast influence on the software development life cycle (SDLC). Among the many factors influencing quality, maintainability is one of the most important, and it can be measured using different metrics. In recent times the Object-Oriented (OO) approach has become salient in building scientific and business applications, while the structured approach remains strong in embedded applications. It is significant to find the impact of metrics on each other when different programming languages are considered, because they play a significant role in predicting software maintainability. This research empirically analyzes the dependencies among various metric values obtained from software that is similar in both structured (C) and Object-Oriented (Java) programming, using the CCCC and HM tools. Further, the relationships between structured and Object-Oriented programming are found by comparing techniques such as data visualization and correlation in terms of maintainability.
    Keywords: Software Quality; Metrics; SDLC; Maintainability.

    by Kishore Kumar Kumar, Sivachandran P, Suganyadevi MV 
    Abstract: Solar energy is one of the fastest-growing renewable energy sources in the world. It offers various advantages: it is pollution-free and quiet in operation, has a long life, incurs no input energy cost and needs little maintenance. The characteristics of a solar cell depend on environmental parameters, mainly sun irradiance and temperature. To minimize system cost and maximize array efficiency, a new fuzzy logic control methodology is implemented in a DC-DC boost converter to extract the maximum power point under various environmental conditions. The new improved fuzzy-logic-based MPPT is created and compared with the conventional incremental conductance (INC) method under various temperature and irradiance conditions. The circuit simulations are carried out in MATLAB/SIMULINK software.
    Keywords: Fuzzy Logic; MPPT; DC-DC Boost Converter.
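For reference, the conventional incremental conductance (INC) baseline mentioned in the abstract works roughly as sketched below; the duty-cycle sign convention for a boost converter and the fixed step size are assumptions of this sketch, not details from the paper:

```python
def inc_mppt_step(v, i, v_prev, i_prev, duty, step=0.01):
    """One iteration of the conventional incremental-conductance (INC) MPPT.
    At the maximum power point, dP/dV = 0, i.e. dI/dV = -I/V.
    Sign convention assumed here: on a boost converter, increasing the duty
    cycle lowers the panel operating voltage."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di == 0:
            return duty                     # no change: at the MPP
        return duty - step if di > 0 else duty + step
    slope = di / dv
    if slope == -i / v:
        return duty                         # dP/dV = 0: at the MPP
    # Left of the MPP (dP/dV > 0): raise panel voltage by lowering the duty.
    return duty - step if slope > -i / v else duty + step
```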

  • A complete analysis of Integrated Vehicle Health Management for Aircraft With Pros, Cons, Suggestions for improvement and Future Prospects.   Order a copy of this article
    by Vimalkumar K, K.V. Shriram 
    Abstract: Integrated vehicle health management (IVHM) is a concept that comprises the integration of sensors, communication technologies, artificial intelligence, data analytics and software health management to provide vehicle-wide abilities for diagnosing problems and recommending solutions. IVHM monitors the condition/health of the vehicle by analyzing the readings from the sensors installed in it. An aircraft is a vehicle that needs to be monitored continuously for flawless and uninterrupted functioning. The data collected from the sensors installed in the aircraft help to analyze its present performance and predict its future performance. The data can also be used to make operational decisions, which are critical for real-time performance. This paper provides a state-of-the-art report on the IVHM concept with an organized review of previous research works. The articles are collected from different sources and analyzed, and the major works are summarized. The paper presents the underlying concept of IVHM and its roadmap, the use of IVHM in aircraft, existing approaches and the barriers to adopting them for aircraft, available techniques and future research directions. Overall, this paper presents the reader with what IVHM is, the state of the art and future prospects.
    Keywords: Integrated Vehicle Health Management; IVHM for Aircraft; Prognostics; Prediction; Aircraft safety; Aircraft health monitoring;.

  • SVM-based Multiple Instance Learning Approach to Select the Best Answer in CQA Sites   Order a copy of this article
    by Tirath Prasad Sahu, Naresh Nagwani, Shrish Verma 
    Abstract: A community question answering (CQA) site is an online platform where a user posts a question and receives the best answer from among multiple answers posted by others. On most CQA sites, the best answer is selected manually from the multiple answers to a particular question: answers are voted on, and the answer with the highest votes is generally selected as the best. Since CQA sites receive questions frequently, it becomes tedious for the asker or the community to select the best answer for every posted question. This paper proposes a support vector machine (SVM)-based multiple-instance learning (MIL) technique for selecting the best answer among all answers posted for a particular question on a CQA site. The MIL approach learns the answers (multiple instances) of a question (a bag) using SVM. The prediction of the best answer for a question is derived from the maximum-instance-margin problem of MIL in supervised classification. It is shown that the performance parameters ROC-AUC, PRC-AUC and G-mean for the proposed model are significantly better than those of the existing traditional model in predicting the best answers.
    Keywords: Support vector machine; Community question answering; multiple instance learning; classification; topic modelling; activeness; expertise.

  • Who Will be My Dearest One? An Expert Decision   Order a copy of this article
    Abstract: Recommendation systems assist in finding the right things for the right users. A matchmaking recommendation system connects users, enhances social relationships, saves time, minimizes the risks involved in offline suggestions, and encourages collaboration. Matchmaking implies the ability to recommend potential partners for target users by matching profiles against the preferences provided by the users. In the present era, people are occupied with busy schedules, so a smart recommendation system will be a highly demanded service for residents of smart cities: fully planned and digitized cities whose people readily adopt online services. Demographic filtering is the most widely used technique for matchmaking recommendation systems. In this research, a novel demographic-filtering-based matchmaking framework is proposed that precisely matches users' profiles to provide the top-n recommendations. The matchmaking is accomplished using a hybrid of K-means clustering and ant colony optimization. Support vector regression is also employed to enhance the performance and make the decisions more precise and realistic.
    Keywords: Ant Colony Optimization; Demographic Filtering; K-Means Clustering; Recommendation Systems; Support Vector Regression.
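The demographic-filtering stage can be illustrated with a minimal sketch; the attribute names, weights and exact-match scoring below are hypothetical, and the K-means/ant-colony/SVR stages of the proposed framework are omitted:

```python
def top_n_matches(target, candidates, weights, n=2):
    """Rank candidate profiles for a target user by weighted agreement on
    demographic attributes. Profiles are dicts; 'weights' gives each
    attribute's importance. Returns the ids of the top-n candidates."""
    def score(cand):
        # Sum the weight of every attribute on which the candidate agrees
        # with the target's stated preference.
        return sum(w for attr, w in weights.items()
                   if cand.get(attr) == target.get(attr))
    ranked = sorted(candidates, key=score, reverse=True)
    return [c['id'] for c in ranked[:n]]
```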

  • A Hybrid DNA Cryptography Based Encryption Algorithm Using the Fusion of Symmetric Key Techniques   Order a copy of this article
    by Animesh Hazra, Soumya Ghosh, Sampad Jash 
    Abstract: The twenty-first century is the era of e-commerce and e-business, in which information security plays a vital role. The rising rate of data breaches has called the safety of digitalization schemes into question. The security measures adopted by any industry or organization, however big or small, should be flexible enough for an ever-changing and challenging data-breach environment. Hence the need for encryption and user access controls to safeguard data, which has given birth to various hybrid cryptographic methods, one of them being DNA cryptography. This study presents a brief sketch of DNA cryptology together with an innovative algorithm founded on the amalgamation of DNA nucleotides, the XOR operation and the symmetric key concept. The algorithm presented here is efficient, and one of its salient features is that the level of security can be customized according to the needs of the sender.
    Keywords: Complement Operation; Data Security; Decryption; Digital Coding; DNA Cryptography; Encryption; Fusion; Substitution; Symmetric; XOR operation.
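The digital-coding and XOR layers that such schemes combine can be illustrated as follows; the 2-bit base mapping shown is a common convention, not necessarily the one used in the paper:

```python
# Digital coding of DNA bases: each base carries two bits.
BASE_BITS = {'A': '00', 'C': '01', 'G': '10', 'T': '11'}
BITS_BASE = {v: k for k, v in BASE_BITS.items()}

def to_dna(data: bytes) -> str:
    """Encode a byte string as a DNA nucleotide string (4 bases per byte)."""
    bits = ''.join(f'{b:08b}' for b in data)
    return ''.join(BITS_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Symmetric XOR layer: applying it twice with the same key decrypts."""
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

# Encrypt, then express the ciphertext as a DNA sequence.
cipher_dna = to_dna(xor_encrypt(b'Hi', b'K'))
```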

  • Implementation and comparison of the Image Edge Detection Techniques with complete analysis and suggestions for improvement   Order a copy of this article
    by Parvin N, Kavitha P 
    Abstract: Image processing is one of the most popular technologies nowadays and remains a core area of research in engineering, especially in the computer science discipline. Image processing is done in either analog or digital form. Analog image processing is widely used where physical forms of data, such as printouts or photographs, are involved. Digital image processing, on the other hand, manipulates digital images with the help of algorithms. Digital images undergo different processing stages, namely pre-processing, enhancement and information extraction. Edge detection is a technique generally used in image processing to identify the boundaries of objects within an image, and it can be a valuable pre-processing step for image segmentation. There are several algorithms for detecting edges. In this paper, three edge detection techniques, namely Prewitt, Roberts and Sobel, are analyzed. It is experimentally observed that Sobel gives better results than the other two. This work is implemented in MATLAB R2016a.
    Keywords: digital image processing; image segmentation; Edge detection; Prewitt; Sobel; Roberts;.
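The Sobel operator evaluated in the paper can be sketched as a pair of 3x3 convolutions; this minimal version skips the image borders for brevity:

```python
# Sobel kernels for horizontal (Gx) and vertical (Gy) gradients.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img):
    """Gradient magnitude |G| = sqrt(Gx^2 + Gy^2) at interior pixels.
    img is a 2-D list of grey levels; border pixels are left at 0."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

Pixels on a strong vertical step edge yield a large response, while flat regions yield zero; thresholding the magnitude map gives the edge image.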

  • Template Based Approach for Question Systematization   Order a copy of this article
    by Urmila Shrawankar, Komal Pawar 
    Abstract: The main reason for questioning is to gather needed information or seek an explanation of a certain topic. But correct information can be gathered only with a specific, error-free question, and various applications require error-free, standard questions. The problem of statement construction has received more attention than question construction. This work concentrates on error-free question construction using text systematization. A template-based approach, built around question templates, is used to carry out this process. The templates are designed manually through coding. This is accompanied by a dictionary approach and a powerful natural language processing technique, POS tagging, which follows a maximum-entropy-based algorithm. Different error parameters are considered for the correction. The work covers domain-specific WH-type questions in English along with imperative questions. It has several applications, namely setting exam question papers, helping English learners study interrogative constructs properly, and producing intermediate output for complex systems such as question-answering systems.
    Keywords: POS tagger; question templates; systematization; template based approach; WH-questions.
    DOI: 10.1504/IJAIP.2019.10027084
  • Hybrid Probabilistic Triple Encryption (HPRRA) Approach for Data Security in Cloud Computing   Order a copy of this article
    by Vartika Kulshrestha, Seema Verma, C. Rama Krishna 
    Abstract: Cloud storage is a popular paradigm with dynamic capabilities in which data are maintained, managed and backed up remotely. It allows consumers as well as corporations to use services over a web network on demand, and increases the utilization of hardware resources through optimal sharing. However, cloud information security and privacy have become crucial issues that affect the success of cloud computing. First and foremost, storing information in the cloud increases the risk of information leaks and illegal access. Second, cloud servers are becoming the targets of attacks and intrusions, which challenges cloud security. Third, information management tasks in the cloud, such as storage, backup, migration, removal, update, exploration, query and access, may not be completely trusted by the information's owners. Therefore, to enhance security, a hybrid security technique for cloud computing using a probabilistic approach is proposed, based on the RSA and AES cryptosystems. The proposed framework focuses on the encryption and decryption methods, providing the cloud consumer with assurance of information security. Our solution is built on a hybrid probabilistic triple encryption approach: the information is encrypted before being moved to the cloud, with a hash value computed for each item, delivering data integrity, confidentiality and user authenticity. A review of different privacy and security issues is also included.
    Keywords: Cloud computing; security; RSA; AES; authenticity.
    DOI: 10.1504/IJAIP.2021.10036005
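The general hybrid pattern combined here (symmetric bulk encryption, an RSA-wrapped session key, a hash for integrity) can be sketched as follows. This is an illustrative sketch only: the tiny textbook-RSA parameters are insecure, the fixed session key would be random in practice, and a SHA-256-derived XOR keystream stands in for the real AES used in the paper (requires Python 3.8+ for `pow(e, -1, phi)`):

```python
import hashlib

def rsa_keygen():
    # Tiny textbook-RSA parameters for illustration only.
    p, q, e = 61, 53, 17
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)                     # modular inverse of e
    return (e, n), (d, n)

def keystream_xor(data: bytes, key: bytes) -> bytes:
    # Stand-in for AES: XOR with a SHA-256-derived keystream (self-inverse).
    stream, counter = b'', 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, 'big')).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

def hybrid_encrypt(message: bytes, rsa_pub):
    session_key = b'session-key'            # would be random in practice
    ciphertext = keystream_xor(message, session_key)
    e, n = rsa_pub
    wrapped = [pow(b, e, n) for b in session_key]   # RSA-wrap the key bytes
    digest = hashlib.sha256(message).hexdigest()    # integrity check
    return ciphertext, wrapped, digest

def hybrid_decrypt(ciphertext, wrapped, rsa_priv):
    d, n = rsa_priv
    session_key = bytes(pow(c, d, n) for c in wrapped)
    return keystream_xor(ciphertext, session_key)
```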
  • Energy Efficient Scheduling of Scientific Workflows in Cloud with improved Makespan using Hybrid of Genetic and Max-Min Algorithms   Order a copy of this article
    by S. Balamurugan , S. Saraswathi  
    Abstract: Recently, there has been increased use of the cloud and its resources to deploy and run complex scientific applications. Scientific applications involve large numbers of dependent tasks and huge volumes of data, and only a cloud environment can provide a stable platform with resource provisioning and scalability. Scheduling a scientific workflow in a cloud environment today faces many issues, such as high energy consumption and increased makespan due to inefficient scheduling during execution. In this paper, we propose an energy-efficient scheduling algorithm to reduce the total energy consumption of cloud virtual machines and also improve the makespan of a scientific workflow. The results show that the new algorithm reduces both the total energy consumption and the makespan.
    Keywords: Scientific Workflow; Workflow Scheduling; Virtual Machines; Energy Efficiency; Power Utilization; Task assignment; Task migration; makespan; Genetic Algorithm; Max-min Algorithm.
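The Max-Min half of the hybrid can be sketched as follows; the task-length/VM-speed model is an assumption of this sketch, and the genetic-algorithm and energy-aware parts of the proposed scheduler are omitted:

```python
def max_min_schedule(task_lengths, vm_speeds):
    """Max-Min heuristic: repeatedly pick the task whose best (earliest)
    completion time is largest, and assign it to the VM that completes it
    earliest. Returns (assignment dict task->vm, makespan)."""
    ready = {vm: 0.0 for vm in range(len(vm_speeds))}   # VM ready times
    tasks = dict(enumerate(task_lengths))
    assignment = {}
    while tasks:
        # For each task: its minimum completion time over all VMs.
        best = {t: min((ready[v] + ln / vm_speeds[v], v) for v in ready)
                for t, ln in tasks.items()}
        # Max-Min: schedule the task with the largest such minimum.
        t = max(best, key=lambda k: best[k][0])
        finish, v = best[t]
        assignment[t] = v
        ready[v] = finish
        del tasks[t]
    return assignment, max(ready.values())
```

Scheduling the longest tasks first keeps them from being left to the end, which is what improves the makespan relative to Min-Min.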

  • Self Organized Map and trust-aware-based quality of service prediction for reliable services selection in distributed computing environment.   Order a copy of this article
    by Youcef Ould-Yahia, Meziane Yacoub, Samia Bouzefrane, Hanifa Boucheneb 
    Abstract: The distributed computing environment makes it possible to provide outsourced computing services in addition to web services for IoT and mobile technologies. An emerging research topic is the prediction of QoS and security indicators to achieve a reliable service selection that meets user requirements. Collaborative filtering is one of the most widely used models in service selection; it is based on computing the similarity between users or services. The main drawback of this method is the lack of data with which to compute an effective similarity value. Furthermore, malicious users give false feedback, which affects the accuracy of the prediction. In this work, we propose a novel similarity evaluation model based on self-organizing maps to address the data-scarcity problem, along with a robust index computation to detect untrustworthy users. The proposed approach uses a K-means-based average evaluation to determine the tendency of the data and an offline model-building step to increase computational efficiency.
    Keywords: Distributed computing; Web-services; QoS prediction; Trust-aware; Internet of Things; Mobile-edge computing; Self-organizing map.

  • Some Properties of Bipolar Complex Neutrosophic Graph   Order a copy of this article
    by Hossein Rashmanlou, Muhammad Shoaib, M.A. Malik, Yahya Talebi, Ali Asghar Talebi 
    Abstract: Neutrosophic graphs have many uses in different areas of biology and physics, giving a way to handle uncertain information. The complex neutrosophic graph is an extension of the complex fuzzy graph. Following the general mathematical approach of blending different aspects, we combine bipolar complex neutrosophic fuzzy sets with graph theory and introduce the notion of bipolar complex neutrosophic graphs. We prove that the bipolar complex neutrosophic graph is a generalization of the complex neutrosophic graph, with more flexibility and compatibility. The main aim of this study is to establish several properties, namely the cartesian product, composition, strong product, semi-strong product and direct product of bipolar complex neutrosophic graphs (BCN-graphs, for short), which are discussed in depth with various examples and proofs.
    Keywords: cartesian product; composition; strong product; semi-strong product; direct product; bipolar complex neutrosophic graph.

  • Analysis of Performance and Trust in Load Balancing Algorithm on Cloud Computing Environment   Order a copy of this article
    by Ajay Kumar Dubey, Vimal Mishra 
    Abstract: Load balancing is a technique to improve the performance of virtual machines in a cloud environment. A load balancing algorithm tries to divide the total workload fairly among all the nodes, to ensure the optimum use of available resources. Many load balancing techniques exist in the cloud computing environment, and many researchers have suggested different techniques to solve the load balancing problem. This paper presents a simulated analysis of various existing load balancing algorithms in the cloud computing environment, in which various load balancing metrics were used to evaluate the performance and trust of the existing algorithms. The ultimate aim of this paper is to help in the design of new algorithms by analysing the merits and demerits of the existing ones.
    Keywords: Cloud Computing; Performance Metrics; Load Balancing; Virtual Machine; Trust.
    DOI: 10.1504/IJAIP.2021.10030101
  • Integration of GIS&WSN for an intelligent traffic signal   Order a copy of this article
    by A.S. Elmotelb, Bahaa.T. Shabana, I.M. Elhenawy 
    Abstract: In most urban areas, traffic signals are the backbone of the transportation system. Consequently, transportation authorities have started to search for techniques that emphasise the effective usage of the current transportation framework. However, existing strategies lack the ability to avoid conflicts in the traffic environment and suffer from low service rates, long waiting times, long queues and a large amount of pollution. Recent advancements in Wireless Sensor Networks (WSNs) and Geographic Information Systems (GIS) provide new opportunities for better traffic management. Thus, this paper proposes a Traffic Signal Management Dynamic Cycle (TSMDC) control system for traffic signal management. TSMDC has been tested on real traffic data (loaded from the OpenStreetMap site). Compared with ordinary systems, TSMDC increases mean travel time by 11.2%, decreases halting by 93.7%, decreases queuing time by 89%, reduces queue length by 67.2% and reduces the amount of pollution by 40.6%.
    Keywords: Geographic Information System (GIS); Wireless Sensor Network (WSN); traffic signals management system; Intelligent Transportation System (ITS); transportation systems.

  • A Novel Variant of Bat Algorithm Inspired from CATD-Pursuit Strategy & Its Performance Evaluations   Order a copy of this article
    by Shabnam Sharma, Sahil Verma, Kiran Jyoti 
    Abstract: This paper presents a novel nature-inspired optimization technique, a variant of the standard bat algorithm. The technique is inspired by the pursuit strategy of microchiropteran bats and their efficient adaptation to a dynamic environment, where the dynamic environment reflects the different movement strategies adopted by the prey (target) during pursuit; accordingly, bats must adopt different pursuit strategies to capture the prey. In this research work, a variant of the bat algorithm is proposed based on the Constant Absolute Target Detection (CATD) pursuit strategy adopted by bats when targeting prey that moves erratically. The proposed algorithm is implemented in Matlab. The results are validated against the standard bat algorithm on the basis of best, mean, median, worst and standard deviation values, and demonstrate that the proposed algorithm provides better exploration and avoids becoming trapped in local optima.
    Keywords: Bat Algorithm; Constant Absolute Target Detection (CATD); Computational Intelligence; Echolocation; Meta-heuristic; Nature-Inspired Intelligence; Optimization; Pursuit Strategy; Swarm Intelligence.
    DOI: 10.1504/IJAIP.2021.10030248
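The standard bat algorithm update that the proposed variant builds on can be sketched as follows; the frequency range is a commonly used default, not a value from the paper, and the loudness/pulse-rate mechanics are omitted:

```python
import random

def bat_step(position, velocity, best, f_min=0.0, f_max=2.0):
    """Frequency, velocity and position update of the standard bat algorithm:
        f_i = f_min + (f_max - f_min) * beta,  beta ~ U(0, 1)
        v_i <- v_i + (x_i - x*) * f_i          (x* = current global best)
        x_i <- x_i + v_i
    """
    beta = random.random()
    f = f_min + (f_max - f_min) * beta
    velocity = [v + (x - b) * f for v, x, b in zip(velocity, position, best)]
    position = [x + v for x, v in zip(position, velocity)]
    return position, velocity
```

A bat already sitting at the global best with zero velocity stays put, while bats elsewhere are pulled toward it with a randomly scaled frequency.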
  • Wireless Smart Automation Using IOT Based Raspberry Pi   Order a copy of this article
    by Vasu Goel, Akash Deep, Madireddy Vivek Reddy, Yedukondala Rao Veeranki 
    Abstract: In this paper we propose a smart door lock and lighting system for home automation. The system is controlled by a Radio Frequency Identification (RFID) reader, programmed via a Raspberry Pi, which detects the swipe of a university combo card or RFID tag and wirelessly sends a signal to the Espruino (ESP) Wi-Fi module and Node Microcontroller Unit (MCU), which in turn activate the lighting and door lock systems. The mainstream application of the system is in hostel rooms or homes, wherever there are door locks, so that doors can be opened at any time with the swipe of a card, without disrupting our work or having to get up, which is helpful in case of injury.
    Keywords: Internet-Of-Things; Raspberry pi; Radio-Frequency Identification; Home automation; MQTT.
    DOI: 10.1504/IJAIP.2019.10026853
  • VLSI Implementation of ECG Feature Extraction: A Literature Review   Order a copy of this article
    by Surendhar S, Thirumurugan P, Ezhilmathi N, Sathesh Raaj R 
    Abstract: In this paper, we present a comparative study of VLSI implementations of electrocardiogram (ECG) feature extraction methods used to diagnose different cardiac arrhythmias. ECG feature extraction plays a significant role in diagnosing most cardiac diseases in time to avoid mortality. In an ECG, the P-QRS-T wave is processed using novel methods to find peak amplitudes and time periods. Recently, various methods for analyzing the ECG signal have been implemented in VLSI by multiple researchers, and these techniques and algorithms have their own merits and demerits. In this paper, the various methods and techniques for cardiac analysis are discussed in a literature review.
    Keywords: Area; Detection Error Rate; Delay; ECG signal; Feature Extraction; Power and Support Vector Machines.

  • Modelling Context Awareness in Internet of Things with Business Process Model and Notation 2.0 extensions   Order a copy of this article
    by Varun M. Tayur, R. Suchithra 
    Abstract: The Internet of Things is rapidly expanding its reach into a multitude of real-world applications. The steadily increasing number of devices puts the onus on using automated means to manage them and process their data efficiently. Enterprises and device developers face a unique challenge in enabling business processes to interact with objects in the world around us. Well-established business process modelling notations such as Business Process Model and Notation (BPMN) 2.0 fall behind when it comes to modelling real-world process requirements. Real-world processes are event-driven, time- and location-aware, mobile, and often must perform social behavioural tasks, whereas traditional business processes are rather static and mostly ignorant of the operating context. Most current work on extending business processes to model context concentrates on modelling location and expressing it as events. Business processes must adapt dynamically based on the context of execution. Modelling context awareness into business processes includes expressing relations with the environment, resources and humans, in addition to the spatial, temporal and environmental factors that influence the process. Specialisations of DataObject, Task and Pool enable accurate modelling of real-world processes based on context, making them more reactive to the environment around them. A home automation use case is evaluated using the proposed extensions, and it is found that they enable accurate representation of context data and of tasks that can consume the context and perform contextual actions which would otherwise require complex logic based on events and gateways.
    Keywords: process automation; bpmn extensions; context data in bpmn; context tasks in bpmn; context pools in bpmn; context aware business process; dynamic workflows.
    DOI: 10.1504/IJAIP.2022.10039675
  • Comparative Study of Kernel Algorithms On SIMD Vector Processor for 5G Massive MIMO   Order a copy of this article
    by Ravi Sekhar Yarrabothu, Pitchaiah Telagathoti 
    Abstract: The world is currently moving towards achieving gigabit data rates via the 5G mobile revolution. Massive Multi-In-Multi-Out (MIMO) is one of the key enablers, and much interest has recently been shown in this area. The efficiency of the algorithms used to estimate and detect the channel plays a crucial role in the success of massive MIMO. The existing LTE-A algorithms for this purpose are not efficient in terms of power consumption and latency, which are among the foremost requirements of 5G communications. The biggest hurdle to achieving ultra-low latency in 5G massive MIMO is the very large number of computations required for matrix inversion during channel estimation and detection. In this paper, a comparative study is made of two parallel processing schemes, the Gauss-Jordan elimination and LU decomposition kernel algorithms, on a single instruction multiple data (SIMD) stream vector processor for realizing matrix inversion with optimum latency, a prerequisite for 5G channel estimation and detection. Both matrix inversion algorithms are analyzed, and LU decomposition provides the required reduction in computational operations, which translates into lower latency and lower battery power consumption.
    Keywords: Massive MIMO; SIMD; 5G ; DMRS; SRS; LTE - A.
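The trade-off discussed above can be illustrated with a plain LU-based inversion; this scalar sketch ignores pivoting and SIMD vectorization, and the operation-count remark in the comment is the standard textbook comparison rather than a figure from the paper:

```python
def lu_decompose(a):
    """Doolittle LU factorization (no pivoting; assumes well-conditioned
    input). Factoring costs about n^3/3 multiply-adds, versus roughly n^3
    for full Gauss-Jordan inversion, which is why LU maps better to
    low-latency hardware."""
    n = len(a)
    L = [[float(i == j) for j in range(n)] for i in range(n)]
    U = [row[:] for row in a]
    for k in range(n):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]
    return L, U

def lu_solve(L, U, b):
    """Forward substitution Ly = b, then back substitution Ux = y."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][j] * y[j] for j in range(i))
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

def invert(a):
    """Invert A by reusing one LU factorization to solve A x = e_k for
    every unit vector e_k (each solution is a column of A^{-1})."""
    n = len(a)
    L, U = lu_decompose(a)
    cols = [lu_solve(L, U, [float(i == k) for i in range(n)]) for k in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(n)]
```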

  • Automatic Face Enhancement Technique using Sigmoid Normalization based on Single Scale Retinex (SSR) Algorithm   Order a copy of this article
    by A. Baskar, T. Gireesh Kumar 
    Abstract: Changes of illumination during image acquisition affect the visual quality of face images, leading to inaccurate results in face detection and recognition. Thus, the pre-processing stage of face detection needs a method to improve the visual quality of the image. This paper presents an automatic face enhancement technique using sigmoid normalization based on the single scale Retinex (SSR) algorithm. The proposed work adopts the smooth, continuous nonlinear sigmoid activation function as a normalization factor, which decreases the impact of outlier data. The algorithm comprises three steps. First, the colour selection module converts the input face image into different colour spaces. Next, SSR computes the reflectance values of the image in each colour space. Finally, sigmoid normalization takes the reflectance image and achieves good dynamic range compression through a contrast factor (C). The proposed algorithm is compared with min-max-based SSR, sigmoid-based enhancement, mean, median and histogram equalization. Its performance is validated on the MUCT and IMM databases using five quality metrics: 1. mean squared error (MSE), 2. normalized absolute error (NAE), 3. average difference (AD), 4. peak signal-to-noise ratio (PSNR) and 5. normalized cross-correlation (NK). The experimental results show that the proposed algorithm attains better enhancement with the contrast factor (C) between 40 and 50 under different illumination.
    Keywords: Face Enhancement; Sigmoid Normalization; Single Scale Retinex; Min-Max Normalization; Illumination; Pre-Processing;.
    DOI: 10.1504/IJAIP.2021.10030669
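The SSR-plus-sigmoid pipeline can be sketched in one dimension as follows; the box average standing in for the Gaussian surround of true SSR and the neighbourhood radius are assumptions of this sketch, while the contrast factor follows the 40-50 range reported in the abstract:

```python
import math

def ssr_sigmoid(intensities, c=45.0, radius=2):
    """Single-scale Retinex with sigmoid normalization, sketched in 1-D:
        R(x)      = log I(x) - log (I * F)(x)      (F: surround filter)
        output(x) = 1 / (1 + exp(-c * R(x)))       (c: contrast factor)
    intensities must be strictly positive grey levels."""
    n = len(intensities)
    out = []
    for x in range(n):
        lo, hi = max(0, x - radius), min(n, x + radius + 1)
        surround = sum(intensities[lo:hi]) / (hi - lo)  # box stand-in
        r = math.log(intensities[x]) - math.log(surround)
        out.append(1.0 / (1.0 + math.exp(-c * r)))
    return out
```

A uniform region maps to 0.5 everywhere (reflectance ratio of 1), while pixels brighter than their surround are pushed towards 1, which is the dynamic range compression the normalization provides.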
  • Data Mining Techniques and Fuzzy Logic to Build a Risk Prediction System for Stroke   Order a copy of this article
    by Farzana Islam, M. Rashedur Rahman 
    Abstract: Nowadays, the medical sector predicts diseases using different computational systems, which aid not only medical experts but also ordinary people. In recent years stroke has become a life-threatening, deadly condition that is increasing at an alarming rate globally. Early detection of stroke can help people at high risk make decisions and change their lifestyle, so there is high demand for computational expertise in stroke prognosis. This research attempts to make early predictions of stroke using data mining techniques. This paper proposes a rule-based classifier along with other techniques. The dataset was collected from Dhaka Medical College, situated in Dhaka, Bangladesh. To build a more accurate and acceptable model, the system uses different classification methods, namely decision trees, support vector machines, artificial neural networks and a fuzzy model. The K-means, EM and fuzzy C-means clustering algorithms are used to label the dataset more accurately, and a fuzzy inference system is built to generate rules. ANFIS provides the most accurate model.
    Keywords: stroke; decision tree; SVM; MLP; artificial neural network; support vector machine; fuzzy model; FIS; ANFIS; data-mining; fcm; clustering; EM clustering; k-means; Bangladeshi dataset; fuzzy rule.

  • Optimization of Hopfield networks for storage and recall: A decade Review   Order a copy of this article
    by Jay Kant Pratap Singh Yadav, Arun Kumar Yadav, Divakar Yadav, Vikash Yadav 
    Abstract: Storing and recalling patterns in an efficient and effective manner is a prominent task in the pattern recognition field. Recurrent (also called feedback) networks are the networks most frequently used to store and recall patterns, having a brain-like capability of recalling patterns from noisy or partial inputs. From a detailed study of different neural networks, it is found that the Hopfield neural network outperforms the others. In this paper, we review a decade of work on optimizing the Hopfield neural network to improve its storage capacity and pattern recall.
    Keywords: Hopfield Neural Network; Genetic Algorithm; Cross-Association; Quad bit.
    DOI: 10.1504/IJAIP.2019.10026855
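The basic Hopfield storage and recall that the reviewed optimizations improve on can be sketched as follows (Hebbian storage, synchronous sign updates; asynchronous update schemes are also common):

```python
def hopfield_train(patterns):
    """Hebbian weight matrix W = (1/P) * sum_p x_p x_p^T, zero diagonal.
    Each pattern is a list of +1/-1 values."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def hopfield_recall(w, state, steps=5):
    """Synchronous updates s_i <- sign(sum_j w_ij s_j) until 'steps' passes;
    a noisy or partial input settles into the nearest stored attractor."""
    for _ in range(steps):
        state = [1 if sum(wij * s for wij, s in zip(row, state)) >= 0 else -1
                 for row in w]
    return state
```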
  • Clustering Related Behavior of Users by the use of Partitioning And Parallel Transaction Reduction Algorithm   Order a copy of this article
    by Thava Mani C, Rengarajan A 
    Abstract: With the high-speed growth of information in organizations in the present world of business transactions, large-scale data processing is a central issue in information technology. Traditionally, the Apriori algorithm is widely used to find the frequent itemsets in a database. The drawbacks of the Apriori algorithm have been addressed by many later algorithms, but these are also inefficient at finding frequent itemsets in large databases quickly and productively. Hence a new design is proposed that combines coordinated distributed and parallel processing. Experiments were conducted to find frequent itemsets with the proposed and existing algorithms by applying different minimum supports to databases of different sizes. As the dataset grows, Apriori gives poor performance compared to the proposed Partitioning and Parallel Transaction Reduction Algorithm (PPTRA). The implemented algorithm shows better results in terms of time complexity and also handles large databases more efficiently.
    Keywords: Pre-processing; mining of association rules; frequent item sets; parallel; Apriori; matrix; minimum support; partitioning.

  • A Study of Feature Reduction Techniques and Classification for Network Intrusion Detection   Order a copy of this article
    by Meenal Jain, Gagandeep Kaur 
    Abstract: The size of network data is increasing tremendously as web technologies emerge day by day. This huge amount of data contains a large number of attributes which need to be analyzed for a particular application. To analyze the significance of such attributes, different feature reduction techniques can be used. In this paper, three feature reduction techniques, namely Principal Component Analysis (PCA), Artificial Neural Network (ANN), and Nonlinear Principal Component Analysis (NLPCA), have been used to analyze the significance of these attributes. Three newly reduced datasets have been created from the original benchmark Coburg Intrusion Detection Data Set (CIDDS-2017) after applying the above techniques. Four supervised learning based classifiers, namely Decision Tree (DT), K Nearest Neighbor (KNN), Support Vector Machine (SVM), and Naïve Bayes (NB), have been applied to these datasets.
    Keywords: Principal Component Analysis; Artificial Neural Network; Nonlinear Principal Component Analysis; Decision Tree; K Nearest Neighbor; Support Vector Machine; Naive Bayes; Sensitivity_Mismatch_Measure; Specificity_Mismatch_Measure; Information_gain.

  • Redundancy recognition in heavy weight structure with different parameters   Order a copy of this article
    by S. Sahunthala, A. Udhaya Kumar, Latha Parthiban 
    Abstract: Volumes of data are transferred over the internet in user-defined formats. Replica data and heavy weight structures become a problem when data is processed in a data warehouse: they degrade query-processing performance and occupy extra memory space. This paper analyses data replica detection in the heavy weight structure with different parameters such as properties, entities, the structure of the heavy weight tree format, and the number of computation processes. Existing techniques use the heavy weight structure itself for replica detection; as the number of computations increases, query-processing performance decreases, and because the heavy weight structure absorbs a huge amount of space, query processing takes more time. The proposed technique, Light Weight Binary Duplicate Detection (LWBDD), achieves better query-processing outcomes while detecting replica data in a hierarchical heavy weight structure. Being a light weight process, it also generates better-quality outcomes than the existing approaches.
    Keywords: Replica detection; properties; objects; heavy weight structure; light weight structure; query process.
    DOI: 10.1504/IJAIP.2021.10034220
  • Load Balancing with Sip Protocol Using Distributed Web-Server System   Order a copy of this article
    by Harshavardhini Umapathy 
    Abstract: This paper describes an innovative algorithm for balancing load in a distributed web server system by allocating tasks to a cluster of SIP servers built on the Session Initiation Protocol (SIP). The algorithm supports three main queue-based techniques for the locally distributed web server system: CJSQ (Call Join Shortest Queue), TJSQ (Transaction Join Shortest Queue) and TLWL (Transaction Least Work Left), which allocate work to the servers with the least load. The technique incorporates knowledge held by the SIP server to recognise variability among call lengths, and thus produces a standard, dynamic estimate of the load associated with each backend server when initiating particular SIP transactions. The load balancer mainly helps boost the throughput and response time of SIP. SIP is examined in view of its growing use in VoIP, IPTV, audio conferencing and online messaging. A comprehensive study shows how the SIP algorithms drastically reduce response time.
    Keywords: SIP; Load Balancer; DWS; SARA.
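As a rough illustration of the join-shortest-queue idea behind CJSQ/TJSQ (TLWL additionally weights each transaction by its estimated work, which is omitted here), the sketch below dispatches each incoming request to the server with the fewest outstanding items; the server names are hypothetical:

```python
import heapq

class ShortestQueueBalancer:
    """Toy join-shortest-queue dispatcher: every new request goes to the
    server with the fewest outstanding items. A real SIP balancer would
    also decrement counts when calls or transactions complete."""

    def __init__(self, servers):
        # heap of (outstanding_count, tie_breaker, server_name)
        self._heap = [(0, i, s) for i, s in enumerate(servers)]
        heapq.heapify(self._heap)

    def dispatch(self):
        count, tie, server = heapq.heappop(self._heap)
        heapq.heappush(self._heap, (count + 1, tie, server))
        return server

balancer = ShortestQueueBalancer(["sip-a", "sip-b", "sip-c"])
assignments = [balancer.dispatch() for _ in range(6)]
# the six requests spread evenly, two per server
```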

  • Adapted Rank Order Clustering-based Test Case Prioritization for Software Product Line Testing   Order a copy of this article
    by Satendra Kumar, Raj Kumar, Ashish Saini, Monika Rani 
    Abstract: Software Product Line Testing (SPLT) is a strenuous task due to the explosion of derivable products. It is infeasible to test all the products of a Software Product Line (SPL), so several contributions have been presented to overcome this issue by reducing the number of products. However, not much consideration has been given to the test order of the products. The Test Case Prioritization (TCP) technique arranges test cases in a sequence to meet a specific performance goal; it is required to increase the effectiveness and efficiency of fault detection. In SPL, the TCP technique arranges the product configurations in the order in which they are to be tested. An Adapted Rank Order Clustering (AROC)-based TCP approach is proposed for SPLT. Our AROC method utilizes binary weights and decimal weights to arrange the products of an SPL. The results of rigorous experimentation show that the AROC-based TCP approach is better than random order and similarity-based order in terms of fault detection rate.
    Keywords: Software Product Line Testing (SPLT); Test Case Prioritization (TCP); Rank Order Clustering; Feature Model.

  • Transient Liveness Factor (TLF) an Attempt Towards Identification of Innate Fingerprint Under Varying Spoof Sample   Order a copy of this article
    by Akhilesh Verma, Vijay Kumar Gupta, Savita Goel 
    Abstract: Earlier solutions to liveness detection targeted datasets with known attacks. Such solutions are attack- and sensor-dependent, designed selectively according to the taste of the researcher; solutions appreciated initially were discarded later over issues of reproducibility and interoperability. Researchers addressed a wider context in next-level solutions that are more comparative, as they incorporate input variance and, to some extent, meet the expected scope by abstracting the solutions from a wider viewpoint. This work underlines the inherent limitations of current practices, specifically those encountered under seen conditions where limited training data is available. We envision this work as a step towards modelling innateness, a non-separable reality of live samples, instead of learning from non-exhaustive spoof samples; advanced solutions under this new scope are the need of today's and future applications. The work summarises spoof countermeasures to date, which are not sufficiently generalised, and highlights concepts and efforts towards a near-ideal spoof detection system reliable under both known and unknown conditions. Considering that future applications will rely heavily on trustworthy biometric systems, there is an urgent need to standardise Fingerprint Presentation Attack Detection (FPAD) systems, that is, the interpretation and standardisation of datasets, metrics and protocols pertaining to realistic scenarios. In this regard, we put forward a proposal to model a liveness factor more suitable for future applications. We envision it as a service that transparently acts as an indicator of liveness across a variety of biometric applications.
    Keywords: FPAD; innateness; liveness; TLF.

  • EDC-LISP: An Efficient Divide-and-Conquer Solution to The Longest Increasing Subsequence Problem   Order a copy of this article
    by Seema Rani, Dharmveer Rajpoot 
    Abstract: The Longest Increasing Subsequence (LIS) problem was initially viewed as an example of a dynamic-programming approach, and its major applications include aligning whole-genome sequences. We present an optimal solution to the LIS problem using a modified divide-and-conquer approach with O(n log n) time complexity. The proposed method is more efficient and simpler than earlier LIS solutions using the D&C approach. Our approach does not require sorted data, and it is more efficient than a sequential approach because we solve the problem by dividing it into smaller subproblems. During the division phase we do not need any prior knowledge about the length of the LIS; the division process is simple and independent of the type and range of the input sequence and of the LIS itself. We have implemented the proposed approach in the C language using input sequences of lengths ranging from 10 to 100,000 elements.
    Keywords: Longest Increasing Subsequence (LIS); Modified Divide-and-Conquer (MD&C); First Row of Young Tableaux (FRYT); First Subproblem (FSP1); Second Sub-problem (SSP2).
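The well-known O(n log n) patience-sorting construction (closely related to the first row of a Young tableau mentioned in the keywords) is a useful reference point. This is the standard sequential algorithm, not the paper's divide-and-conquer variant:

```python
from bisect import bisect_left

def longest_increasing_subsequence(seq):
    """O(n log n) LIS via patience sorting. tails[k] holds the index of the
    smallest possible tail of an increasing subsequence of length k + 1;
    prev[] links let us reconstruct one optimal subsequence at the end."""
    tails, tail_vals, prev = [], [], [-1] * len(seq)
    for i, x in enumerate(seq):
        k = bisect_left(tail_vals, x)     # pile this element lands on
        if k == len(tails):
            tails.append(i); tail_vals.append(x)
        else:
            tails[k] = i; tail_vals[k] = x
        prev[i] = tails[k - 1] if k > 0 else -1
    out, i = [], tails[-1] if tails else -1
    while i != -1:                        # walk predecessor links backwards
        out.append(seq[i]); i = prev[i]
    return out[::-1]

print(longest_increasing_subsequence([3, 10, 2, 1, 20]))  # → [3, 10, 20]
```

Each element either extends the longest pile seen so far or replaces the smallest tail of an equal-length pile; the predecessor links recover one optimal subsequence.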

  • PEBD: Performance Energy Balanced Duplication Algorithm for Cloud Computing   Order a copy of this article
    by Sharon Priya Surendran, Aisha Banu W 
    Abstract: With the increasing demand for cloud data, efficient task-scheduling algorithms with minimal power consumption are required. In this paper, the Performance-Energy Balanced Duplication (PEBD) scheduling approach is proposed for energy conservation at the point of task duplication. Initially, the resources are preprocessed with Manhattan-distance-based Fuzzy Clustering (MFC). The resources are then scheduled using a novel duplication-aware, fault-tolerant League-BAT algorithm, so that faults expected during job execution can be handled proactively. Fault-adaptive firefly optimization is used for minimizing faults, and it keeps information about resource failures. Consequently, the optimization ensures that performance is improved through task duplication with low energy consumption; duplications are restricted, and strictly forbidden if they significantly increase energy consumption. Finally, an enhanced Compress-and-Join algorithm is used for efficient compression processing. It considers both schedule length and energy savings to enhance scheduling performance with less power consumption. The energy consumption and makespan of the proposed approach improve by 6% and 0.5%, respectively.
    Keywords: Manhattan distance; Fuzzy clustering; Resource scheduling; Duplication; fault tolerance; energy conservation.

  • Friend discovery based on user's interest   Order a copy of this article
    by Kapil Sharma, Sachin Papneja, Nitesh Khilwani 
    Abstract: With the dawn of Web 2.0 and ontological semantic networks, the popularity and usage of social networking platforms have increased dramatically, opening a new area of research for both researchers and academicians. Friend recommendation, one of the indispensable features of social media, has taken these platforms to new heights; Facebook, Twitter, LinkedIn and MySpace have captivated millions of users. However, previous research on friend recommendation focuses on a user's current relations in social networking. Facebook, one of the most prominent social networking platforms, provides personalized friend recommendation based on the FOAF (Friend of a Friend) ontology, while MySpace's recommendation is based on PYMK (People You May Know). The basic perception behind these is that the probability of a person knowing a friend of a friend is higher than that of knowing a stranger. This paper offers a unique approach to friend recommendation based on the user's interests and current location. The main challenge with interest-based friend recommendation is that user interests keep changing. To overcome this challenge, we propose a recommendation system using ontology and spreading activation: the user's interest is captured using spreading activation, which also accommodates variation in that interest. Our experimental results show the benefits of combining spreading activation and ontology for friend recommendation in a social networking platform.
    Keywords: Ontology; Spreading Activation; Social Networking; Friend Recommendation.
    DOI: 10.1504/IJAIP.2022.10035628
  • L(2,2,1)-labelling problems on square of path   Order a copy of this article
    by S.K. Amanathulla, Madhumangal Pal 
    Abstract: The $L(p,q)$-labelling problem has been well studied over the last three decades for its wide applications, especially frequency assignment in (mobile) communication systems, X-ray crystallography, coding theory, radar, astronomy, circuit design, etc. $L(2,2,1)$-labelling, an extension of $L(p,q)$-labelling, has likewise become a well-studied problem due to its applications. Motivated by this, we consider the $L(2,2,1)$-labelling problem for squares of paths.

Let $G=(V,E)$ be a graph. An $L(2,2,1)$-labelling of the graph $G$ is a mapping $\eta: V \rightarrow \{0,1,2,\ldots\}$ such that $|\eta(x)-\eta(y)| \geq 2$ if $d(x,y)=1$ or $2$, and $|\eta(x)-\eta(y)| \geq 1$ if $d(x,y)=3$, where $V$ is the vertex set and $d(x,y)$ is the distance (i.e. the minimum number of edges in a shortest path) between the vertices $x$ and $y$. The $L(2,2,1)$-labelling number of $G$, denoted $\lambda_{2,2,1}(G)$, is the largest non-negative integer used to label $G$. In graph labelling problems the main target is to find the exact value of $\lambda_{2,2,1}(G)$ or to minimise it.

In this paper we study the $L(2,2,1)$-labelling of squares of paths and obtain an exact and unique result for it. A labelling procedure is also presented to label a square of a path. This is the first result on $L(2,2,1)$-labelling of squares of paths.
    Keywords: Frequency assignment; L(2,2,1)-labelling; squares of paths.
    DOI: 10.1504/IJAIP.2022.10034134
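The definition above is easy to check mechanically. In the square of a path $P_n$, vertices $u$ and $v$ are adjacent whenever $|u-v| \leq 2$, so $d(u,v) = \lceil |u-v|/2 \rceil$. A small verifier (illustrative only, not the paper's labelling procedure) is:

```python
from itertools import combinations

def dist_path_square(u, v):
    """Distance in the square of a path: d(u, v) = ceil(|u - v| / 2)."""
    return (abs(u - v) + 1) // 2

def is_l221_labelling(labels):
    """Check the L(2,2,1) conditions on P_n^2, where labels[v] is the
    label of vertex v: gaps >= 2 at distance 1 or 2, >= 1 at distance 3."""
    n = len(labels)
    for u, v in combinations(range(n), 2):
        d = dist_path_square(u, v)
        gap = abs(labels[u] - labels[v])
        if d in (1, 2) and gap < 2:
            return False
        if d == 3 and gap < 1:
            return False
    return True
```

For n = 4, every pair of vertices is within distance 2 (d(0,3) = 2), so all labels must pairwise differ by at least 2: [0, 2, 4, 6] is valid while [0, 2, 4, 5] is not.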
  • An efficient tasks scheduling algorithm for batch processing heterogeneous cloud environment   Order a copy of this article
    by Dhirendra Kumar Shukla, Divya Kumar, Dharmender Singh Kushwaha 
    Abstract: Cloud computing has grown exponentially to provide services to the business and research communities over the last few years. It is an emerging field and has become more popular due to recent advancements in virtualization technology. The tasks requested by various clients are allocated to the available resources, so efficient scheduling of workloads in a heterogeneous environment is required. The resources available at cloud data centers may be of different capacities; due to this heterogeneity, scheduling tasks in the data center requires specific care of the available resources. Makespan reduction is the main objective of scheduling in clouds. This paper proposes an approach for efficiently scheduling tasks in a heterogeneous cloud data center in order to minimize the makespan. Load balancing must also be maintained during scheduling, so CPU utilization with respect to makespan must be high; the approach proposed in this paper achieves both objectives to some extent. The proposed algorithm is tested on benchmark datasets such as Braun's; experimental results reveal that it reduces the overall makespan by up to 24.69% compared to other existing algorithms, while average resource utilization increases by up to 4.29%. The proposed scheduling approach may be useful for CSPs that usually require execution of multiple tasks in a fixed time span.
    Keywords: Makespan; Average Node Utilization; Task Scheduling.
    DOI: 10.1504/IJAIP.2021.10027089
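A simple baseline for the heterogeneous-makespan objective described above is a greedy heuristic: take tasks longest first and give each one to the node that would finish it earliest. This is a common textbook sketch under illustrative task lengths and node speeds, not the paper's algorithm:

```python
def greedy_makespan(task_lengths, node_speeds):
    """Assign each task (longest first) to the node that would complete it
    earliest, and return the resulting makespan. node_speeds expresses
    heterogeneity: a node of speed s runs a task of length L in L / s."""
    finish = [0.0] * len(node_speeds)
    for length in sorted(task_lengths, reverse=True):
        # node whose completion time after taking this task is smallest
        best = min(range(len(node_speeds)),
                   key=lambda n: finish[n] + length / node_speeds[n])
        finish[best] += length / node_speeds[best]
    return max(finish)

print(greedy_makespan([6, 3, 3], [1.0, 1.0]))  # → 6.0
```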
  • Improving Mobile Phone Payment Apps Security with QR Code Security   Order a copy of this article
    by Rijwan Khan, Shadab Ansari 
    Abstract: Nowadays, the use of smartphones is increasing at a very fast rate. These phones have the capabilities of small computers, with the ease of doing almost all tasks with a touch, so people no longer need to depend on the flow of cash: simple digital transactions take its place. One of the major recent developments in mobile phones is the development of mobile banking systems, wallet systems and third-party payment applications. As digital currency is now widely used in the market, many mobile applications have been developed for digital payments. Almost all banks are launching apps with online payment options along with other facilities for their customers, and other players in the market are launching wallet-based mobile applications for such payments. In developing countries like India, digital payment plays a very important role in boosting the economy; digital payments have shown a remarkable increase from 2015 onward in India. In this paper, the authors propose a method for security testing of these applications: the more secure an app is, the more confidence it gives users, resulting in wider adoption. The QR code contains the details of the payer or payee and is used extensively by current payment and wallet systems. The authors propose a method for strengthening QR code security in mobile payment applications.
    Keywords: Mobile Security (MS); Visual Cryptography (VC); Mobile Payment (MP); Cyber Security (CS); Asymmetric Encryption (AE).

  • An encryption approach to improve the security and performance of data by integrating AES with modified OTP Technique   Order a copy of this article
    by Shikha Gupta, Satbir Jain, Mohit Agarwal, Nalin Nanda 
    Abstract: Maintaining the security and privacy of data is of paramount priority while using network and communication systems, and cryptography is the art used to serve this purpose. Various cryptographic techniques are used to protect data from vulnerabilities and potential attacks that occur during data transmission and storage, and encryption is one of the most popular. In this paper, a new encryption strategy is defined and implemented to boost data security. The strategy combines Advanced Encryption Standard (AES) 128-bit encryption, which can be moved up to 256-bit encryption if needed, with a modified One-Time Pad (OTP) encryption technique. A newly devised method for random key generation, named the Secure Key Generator (SKG), is also proposed to add more security to the whole system. The paper presents a detailed study of the AES and OTP encryption techniques and compares the results of the modified OTP technique with existing encryption techniques.
    Keywords: Advanced encryption standard (AES); One Time Pad (OTP); Secure key generator (SKG); Encryption; Decryption.
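The classical one-time-pad half of such a scheme is simply a byte-wise XOR with a fresh random key. In the sketch below, Python's `secrets` stands in for the paper's SKG; the AES layer and the paper's specific OTP modification are omitted:

```python
import secrets

def otp_apply(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each data byte with a key byte. The key must be
    truly random, at least as long as the data, and never reused; because
    XOR is its own inverse, the same function encrypts and decrypts."""
    if len(key) < len(data):
        raise ValueError("OTP key must be at least as long as the data")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"transfer 500 INR"
key = secrets.token_bytes(len(message))   # stand-in for the proposed SKG
ciphertext = otp_apply(message, key)
assert otp_apply(ciphertext, key) == message  # round trip restores plaintext
```

With a truly random, never-reused key the OTP is information-theoretically secure; the practical difficulty of generating and distributing such keys is why schemes like the one above pair it with AES.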

  • Energy-Aware Multi-Objective Job Scheduling in Cloud Computing Environment with Water Wave Optimization   Order a copy of this article
    by Hima Bindu G B, T. Sunil Kumar Reddy 
    Abstract: Job scheduling is the process of assigning jobs to virtual machines, based on their operations, in a sequential manner. Each operation is composed of a set of instructions and has a variable completion time. A virtual machine can execute a single job at a time, and preemption is not possible during job execution; therefore, scheduling a job to the appropriate resource is a crucial task in cloud computing. Hence, this paper develops an advanced job-shop scheduling scheme in the cloud environment using an enhanced water wave optimization (WWO) algorithm known as control-adaptive WWO (CAP-WWO). The proposed scheme is compared with conventional algorithms, and the results prove its efficiency.
    Keywords: Job-shop scheduling; WWO; execution time; utilization rate; throughput; makespan.

  • Face Recognition using Local Binary Pattern and Gabor-Kernel Fisher Analysis   Order a copy of this article
    by Tulasi Krishna Sajja, Hemantha Kumar Kalluri 
    Abstract: Face recognition technology features in everyday tasks in our daily life, but recognizing the correct face with high accuracy in large databases is challenging. To overcome this challenge, a feature fusion of the Local Binary Pattern (LBP) with Gabor-Kernel Fisher Analysis (Gabor-KFA) is proposed for face recognition. In this method, Gabor features are extracted from the face image using a Gabor filter and, in parallel, features are extracted from the LBP-coded face image; combining these extracted features generates a high-dimensional feature space. With such high-dimensional features, training and identification time may increase, so the Kernel Fisher Analysis algorithm is adopted to reduce the feature vector size. Experiments were conducted separately on the Gabor features and on the fused features. To test the performance of the proposed approach, experiments were performed on the IIT Delhi, ORL and FR databases.
    Keywords: Face recognition; Gabor filter; KFA; LBP; Fusion Method.
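The LBP half of the fusion is simple to illustrate: for each pixel, the basic 3x3 operator thresholds the eight neighbours against the centre and packs the results into one byte. This is a minimal sketch with an invented grey-level patch; the Gabor filtering and KFA stages of the paper's pipeline are omitted:

```python
def lbp_code(patch):
    """Basic 3x3 local binary pattern: threshold the eight neighbours
    against the centre pixel and read the bits as one byte. `patch` is a
    3x3 list of grey levels."""
    centre = patch[1][1]
    # clockwise neighbour order starting at the top-left corner
    neighbours = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                  patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for bit, value in enumerate(neighbours):
        if value >= centre:
            code |= 1 << bit
    return code

print(lbp_code([[9, 9, 9], [1, 5, 1], [1, 1, 1]]))  # → 7 (top row brighter)
```

Sliding this operator over the image and histogramming the codes (restricted to the "uniform" patterns in the ULBP variant) yields the texture descriptor that is fused with the Gabor features.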

  • Applications of Nature-Inspired Meta-heuristic Algorithms: A Survey   Order a copy of this article
    by Avjeet Singh, Anoj Kumar 
    Abstract: Nature-Inspired Meta-heuristic Algorithms (NIMAs) combine the properties of nature-inspired and meta-heuristic algorithms. Meta-heuristic optimization addresses complex optimization problems through meta-heuristic algorithms. Evolutionary-inspired and swarm-inspired algorithms are the most researched techniques for the development of new nature-inspired algorithms. NIMAs are based on swarm intelligence, biological systems, and chemical systems. They are very efficient compared to other optimization methods, as they thoroughly explore the solution space to find a better optimal solution. Although not all NIMAs are efficient, a few algorithms have proved very effective and are popular tools for solving real-world problems. NIMAs can be categorized into evolutionary-based, swarm-intelligence-based, physics-based, and human-based algorithms, depending on their sources of inspiration. This article surveys evolutionary-inspired and swarm-inspired algorithms. It commences with a summary of NIMAs, followed by a literature survey of some popular evolutionary-inspired and swarm-inspired meta-heuristic algorithms, and also specifies the domains and application areas of all the NIMAs.
    Keywords: Single and multi-objective optimization; meta-heuristic algorithms; heuristic algorithms; nature-inspired algorithms; swarm-based approaches.
    DOI: 10.1504/IJAIP.2021.10027703
  • Email Spam Detection using Bagging and Boosting of Machine Learning Classifiers   Order a copy of this article
    by Uma Bhardwaj, Priti Sharma 
    Abstract: The electronic mail system is one of the foremost communication systems for information users. With the increase in its popularity, the utility and significance of email are also growing. Unfortunately, this has also led to the rise of spam email. Currently, email spam is one of the most concerning internet phenomena, exasperating millions of information users at both the personal and professional level. This paper endeavors to detect email spam by constructing an ensemble system using bagging and boosting of machine learning techniques. The dataset used for the experimentation is the Ling-Spam Corpus. The system detects spam email by bagging the machine-learning-based Multinomial Naïve Bayes classifier.
    Keywords: Email Spam; Text Mining; Naïve Bayes; J48 Algorithm; Spam filtering; Correlation based Feature Selection; Bagging; Boosting.

    by Sasindra Reddy Dappili, Pavan Kumar Kosaraju, Mani Kanta Yetukuri, Prasanna Sai Bodduluri, E.T. Chullai 
    Abstract: Low power margin is the most commonly occurring problem in the turboshaft engines used in helicopters. It is caused by heavy contamination in the air path, which leads to fouling; this reduces compressor air flow, resulting in lower power. Further causes of low power are damage to hot-core components such as power turbine blades and the impeller. Low power margin is defined as the reduction of output shaft power below the minimum power required to lift the helicopter. An engine with low power is identified by the pilot by measuring the torque corresponding to altitude and ambient temperature; if the result does not meet the requirements, the engine is sent to the test bed to find the problem. The low-power snag is confirmed by testing the engine in the test bed: if the power loss is within the acceptable limit, a compressor wash is carried out, through which 25% to 35% of the power is regained. If power is not regained after the compressor wash, the engine is sent to the repair and overhaul division, where the snag is rectified, and then back to the test bed for final analysis; if the engine regains power, it is dispatched. Engine performance is analyzed through graphs, mainly power vs. RPM. During testing, parameters such as power, mass flow rate, delta pressure, gas generator RPM, power turbine RPM and ambient temperature are calibrated in the test bed using the FADEC system. In this paper we compare the power losses due to power turbine blade life-cycle completion, impeller damage and chipping of axial compressor blades, describe the procedures followed to rectify these snags, and compare final engine performance to confirm which snag causes more power loss.
    Keywords: Turboshaft; compressor fouling; corrosion; engine testing.

  • Advanced Redistribution Meta Storage Algorithm for Securing Big Data in Cloud computing   Order a copy of this article
    by Akkipogu Vineela, N. Kasiviswanath, Shoba Bindu C. 
    Abstract: In recent years, big data has been one of the emerging fields for processing huge volumes of data. However, it faces many challenges in terms of security and storage, and implementing cloud computing with big data enriches both. A secure architecture is required to manage big data in the cloud. This paper proposes an architecture for securing big data using the Advanced RedistributiOn MetA storage (AROMA) algorithm. The importance of the proposed approach is that it provides Security-as-a-Service for big data storage. The major issue solved by the proposed approach is restricting cloud service providers from directly accessing users' data. The experimental setup is hosted in the Amazon EC2 environment, and the results prove the efficiency of the proposed method.
    Keywords: Big Data; Cloud; Security; Encryption; Storage.

  • Load-balanced Multilayered Clustering Protocol to Maximize the Lifetime of Wireless Sensor Networks   Order a copy of this article
    by Rohan Gupta, Arnab Nandi 
    Abstract: This article introduces an innovative clustering protocol for load balancing in Wireless Sensor Networks (WSNs). In the proposed protocol, square shape clusters of equal area are arranged in a multilayer fashion, and the base station is at the center of the network. The equal area of square clusters offers a nearly equal number of member nodes in each cluster which leads to comparable energy consumption at cluster heads for transmitting and receiving data from member nodes. This article also introduces a new routing approach in which hop selection is based on the difference of angle between the source and destination cluster heads with respect to a particular point. The efficiency of the proposed protocol concerning network lifetime and energy consumption is evaluated and compared with Low-Energy Adaptive Clustering Hierarchy (LEACH), Enhanced-Modified LEACH (E-MODLEACH) and Least Distance clustering (LDC). The efficiency of the proposed protocol is also evaluated for different optimization algorithms like GWO, PSO, and GSA. The proposed protocol is implemented with these algorithms during the cluster formation stage.
    Keywords: WSN; Clustering Protocol; Load Balancing; Network Lifetime; GWO; PSO; GSA; LEACH; E-MODLEACH; LDC.

  • Nature-inspired Query Optimization Models for Medical Information Retrieval with Relevance Feedback   Order a copy of this article
    by Aditya Jayasimha, Rahul Mudambi, Sowmya S Kamath 
    Abstract: Medical information retrieval is the problem of retrieving relevant medical information from a set of medical documents or a knowledge base for a particular user query. As the volume of available medical records grows, the challenge is determining which documents best suit a given query, in other words, which document is the most relevant for the user, ensuring the highest user satisfaction. Statistical term-weighting methods like Tf-Idf and probabilistic approaches like Okapi BM25 have been used to address this problem, but they suffer from several limitations. Bridging the gap between the information need and the user query is a crucial factor affecting user satisfaction, and it can be addressed through query optimization and relevance feedback. In this paper, we propose a document retrieval framework that incorporates query optimization to improve retrieval performance, using techniques such as the Genetic Algorithm, Particle Swarm Optimization, and Global Swarm Optimization. Further, we use relevance feedback methods to reformulate the user query to improve user satisfaction. The proposed techniques are applied to standard datasets with predefined relevance judgments for quantitative validation. Experimental results using the relevance judgments available in the University of Glasgow's Medline collection underscore the significant improvement achieved by the optimization models using BM25 scores as the fitness function.
    Keywords: Medical Information Retrieval; Relevance Feedback; Meta-heuristic algorithms; Clinical IR; Genetic Algorithm; Particle Swarm Optimization; Global Swarm Optimization; Cosine Similarity score; Okapi BM25 score.
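The Okapi BM25 baseline that the paper optimizes against can be stated compactly. Below is the standard formulation with the usual k1 = 1.5, b = 0.75 defaults, on an invented toy corpus; it is a generic sketch, not the paper's optimized retrieval model:

```python
import math

def bm25_score(query_terms, doc, corpus, k1=1.5, b=0.75):
    """Okapi BM25 score of `doc` (a token list) for `query_terms`, with
    document frequencies taken over `corpus` (a list of token lists)."""
    n_docs = len(corpus)
    avgdl = sum(len(d) for d in corpus) / n_docs
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)              # document frequency
        idf = math.log((n_docs - df + 0.5) / (df + 0.5) + 1)  # smoothed IDF
        tf = doc.count(term)                                  # term frequency
        score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * len(doc) / avgdl))
    return score

corpus = [["fever", "cough", "fatigue"], ["headache", "nausea"],
          ["fever", "fever", "rash"]]
ranked = sorted(corpus, key=lambda d: bm25_score(["fever"], d, corpus),
                reverse=True)
```

A query optimizer can then use such scores as its fitness function, as the abstract describes for BM25.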

  • Student Absenteeism in Engineering College Using Rough Set and Data Mining Approach   Order a copy of this article
    by I. Samuel Peter James, P. Ramasubramanian, D. Magdalene Delighta Angeline 
    Abstract: Nowadays, student absenteeism in engineering education is a most important issue for professional institutions, affecting their overall performance, and it is the most imperative concern in producing outstanding (real) engineers for the country: the quality of education is directly related to student absenteeism. A technique for analyzing the attributes through which a subject influences the total scores of students, using rough set theory (RST), was proposed by P. Ramasubramanian et al. (2009). In this paper, the authors investigate why students are absent in most engineering institutions and work out decision-making methods for grade options that manage student absenteeism using RST. In this study, attributes collected from students, such as faculty qualifications, experience, communication skills, subject knowledge, number of guest lectures/workshops/seminars conducted, college infrastructure, counseling, special coaching for weak students and understanding of subject concepts, are considered.
    Keywords: College-based Behavioral Issues; Data Mining; Education; Educational Data Mining; Engineering College; Quality Education; Quality Teaching; Rough Set; Student Absenteeism; Technical Institution.
    DOI: 10.1504/IJAIP.2021.10034139
  • Hybrid Local Descriptor for Improved Detection of Masses in Mammographic CAD Systems   Order a copy of this article
    by Devi Vijayan, Lavanya R 
    Abstract: Early detection of breast cancer increases the chances of survival considerably. Computer-aided detection (CAD) systems are assistive tools which can render cost-effective, quick and objective decision making. The proposed work addresses the development of a CAD system to automatically classify suspicious regions in mammograms as normal or abnormal, aiming to improve the detection of masses. To this end, we propose to fuse two local descriptors, namely a SIFT-based bag of words (BoW) and the uniform local binary pattern (ULBP), using principal component analysis (PCA). The effect of issues such as obscured masses, typical of dense breasts, is alleviated by the use of segmentation-free local descriptors. Experiments on the benchmark DDSM database with multilayer perceptron (MLP) classification, on features extracted from 4316 suspicious regions, demonstrate the efficiency of the proposed system: it significantly reduces false positives and false negatives, achieving an accuracy of 92.81% with an F1 score of 0.91.
    Keywords: bag-of-words; computer aided diagnosis; local binary descriptors; mammogram; scale invariant feature transform.
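The uniform LBP component of the fused descriptor can be illustrated with a short sketch (this is a generic ULBP check, not the authors' implementation): an 8-bit LBP code is "uniform" when its circular bit pattern contains at most two 0/1 transitions, and only these patterns receive individual histogram bins.

```python
def is_uniform(pattern, bits=8):
    """True if the circular bit pattern has at most two 0/1 transitions."""
    transitions = sum(
        ((pattern >> i) & 1) != ((pattern >> ((i + 1) % bits)) & 1)
        for i in range(bits)
    )
    return transitions <= 2

# The 58 uniform 8-bit patterns each get their own histogram bin;
# all non-uniform patterns share one extra bin (59 bins in total).
uniform_patterns = [p for p in range(256) if is_uniform(p)]
print(len(uniform_patterns))  # → 58
```

This dimensionality reduction (59 bins instead of 256) is what makes ULBP histograms compact enough to fuse with a SIFT bag-of-words vector before PCA.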

  • A Real-Time Novel Road Safety System Pertaining to Indian Road Conditions: An Innovative Attempt   Order a copy of this article
    by Sri Datta Budaraju, K.V. Shriram, Sucharitha V 
    Abstract: The report Road Accidents in India 2016 [10], published by the Ministry of Road Transport and Highways, Government of India, states that about 16,000 accidents were caused by bad roads. During the rainy seasons, the visibility of roads is hindered by waterlogging. Thus, having insight into road conditions before taking a route becomes a necessity. Our work aims at providing real-time road condition data and notifies the rider about any hazards on the route. The crowdsourcing model involves drivers gathering road condition data, which is acquired by analyzing the live IMU sensor data of their smartphones while driving. The road anomaly information is updated in a common cloud database and made available to every other rider using the system. Using this information from the crowdsourced cloud database, our navigation system helps riders navigate safely. The road hazard information can then be sent to the concerned government authorities, helping them maintain the roads much more effectively. The accident information, also acquired using the IMU sensors, can be used to indicate accident-prone zones and accident intensities. This is a simple, efficient and low-cost solution in terms of development and deployment.
    Keywords: Android; Accelerometer; Cloud; Crash; IMU Sensors; Navigation; Potholes; Z-Thresh.
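The Z-Thresh approach named in the keywords can be sketched as a simple threshold test on the vertical accelerometer axis (a hypothetical minimal version; the paper's actual pipeline is more involved):

```python
def detect_road_anomalies(accel_z, z_thresh=3.5):
    """Return sample indices where vertical acceleration (m/s^2, gravity
    removed) exceeds the threshold -- the basic Z-Thresh pothole heuristic."""
    return [i for i, a in enumerate(accel_z) if abs(a) > z_thresh]

# Spikes at samples 2 and 4 are flagged as likely potholes/bumps.
print(detect_road_anomalies([0.1, -0.3, 4.2, 0.2, -3.9]))  # → [2, 4]
```

In a crowdsourced system, each flagged index would be paired with the phone's GPS fix before being uploaded to the shared cloud database.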

  • State of the Art Automatic Machine Transliteration Systems for Indic Scripts: A Comparative Report   Order a copy of this article
    by Sowmya Lakshmi BS, Shambhavi B R 
    Abstract: Due to the proliferation and reachability of social media and smartphones, the number of internet users has increased significantly. As a result of this globalization, the demand for native-language support on the internet and its applications is also increasing. Support for local languages in such applications can be achieved through machine transliteration and machine translation. A fair share of the data generated by users on the internet is a combination of native-language and English words, either Romanized or mixing English and native-language scripts. The paper delivers an exhaustive study of machine transliteration systems used over a span of two decades for Indian languages. The review illustrates that traditional machine learning algorithms such as Support Vector Machines (SVM), Conditional Random Fields (CRF) and decision trees yield excellent outcomes for strongly associated languages, whereas statistical approaches, which depend on probability, are best suited when either the source or the target language is phonetically rich. It is observed that performance is enhanced by combining two or more algorithms.
    Keywords: Natural Language Processing (NLP); Transliteration; Grapheme based approaches; Phoneme based approaches; Hybrid and Combination based approaches.

  • Case-Based Reasoning Methodology for eLearning Recommendation System   Order a copy of this article
    by Swati Shekapure, Dipti D. Patil 
    Abstract: Increasingly, eLearning has become a leading development trend in the industry. As far as learning methodology is concerned, traditional methods such as teacher and student or chalk and duster have given way to modern and innovative learning. Due to the revolution in technology, everyone has started learning over the internet, using devices such as smartphones, laptops, e-book readers and iPods for gaining instruction. However, while procuring learning material, users encounter records that are not relevant to their exploratory questions. Ultimately, there is a considerable delay in locating the essential material on the internet, so there is a need to customize the search by acquiring certain information about the user, improving search quality and saving time. The recommended eLearning system is a case-based system using a case-based reasoning approach and a distinct classification algorithm to categorize students' learning interests. The system assembles a student's learning preferences from a distinct discussion and systematically categorizes these characteristics into a learning standard.
    Keywords: Case-Based Reasoning; K Nearest Neighbor; Learning Style; Recommendation system.
    DOI: 10.1504/IJAIP.2022.10035296
  • Monitoring of environmental parameters using Internet of Things and analysis of correlation between the parameters in a DWC Hydroponic Technique   Order a copy of this article
    by P. Srivani, C.R. Yamuna Devi, S.H. Manjula, K.R. Venugopal 
    Abstract: The Internet of Things (IoT) plays a major role in precision and sustainable agriculture in urban farming. Hydroponics is one such modern technique, where plant growth can be controlled and monitored with ease by integrating intelligence with sensor technology. This research work presents a hydroponic Deep Water Culture (DWC) system to grow Amaranthus dubius indoors, where the roots of the plants are completely submerged in the nutrient solution. The environmental parameters that affect plant growth are monitored and gathered by employing IoT-based technology integrated with sensors. The parameters monitored are humidity, air temperature, water (root-zone) temperature, EC and pH values, through sensor-based technology using an ESP32 with an OLED display. Once the data is collected in the cloud, the dataset is analyzed using the correlation coefficient and a linear regression model to determine the degree of relationship between the parameters. The system is able to show that there is a negative correlation between root-zone temperature and electrical conductivity (EC). Further, the efficient microcontroller used reduces the need for multiple controller boards for different operations.
    Keywords: Hydroponic; Deep Water Culture; Internet of Things; Correlation Coefficient; Linear Regression.
    DOI: 10.1504/IJAIP.2022.10039907
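The correlation analysis described above reduces to computing Pearson's r between the logged parameters. A minimal stdlib sketch with invented readings (not the authors' data) shows how a rising root-zone temperature paired with a falling EC yields a strongly negative coefficient:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative root-zone temperatures (°C) vs. EC (mS/cm): as temperature
# rises, EC falls, so r is strongly negative -- the relationship reported.
temp = [20.0, 21.5, 23.0, 24.5, 26.0]
ec = [2.1, 2.0, 1.8, 1.7, 1.5]
print(pearson_r(temp, ec))
```

The sign of r is what matters here; fitting the linear regression slope would use the same sums.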
  • An Efficient Implicit Lagrangian Twin Bounded Support Vector Machine   Order a copy of this article
    by Umesh Gupta, Deepak Gupta 
    Abstract: In this paper, an enhanced and improved version of the Lagrangian twin bounded support vector machine (LTBSVM), termed the efficient implicit Lagrangian twin bounded support vector machine, is proposed. It is based on fuzzy membership with the dual formulation, in order to reduce sensitivity to any noise and outliers present in the datasets. Here, the fuzzy membership values are determined according to the distribution of the samples. We adopt quadric and centroid fuzzy-based approaches for LTBSVM and propose a quadric-based fuzzy membership approach for LTBSVM (FQLTBSVM) and a centroid-based fuzzy membership approach for LTBSVM (FCLTBSVM). The problems are made strongly convex by using the L2-norm of the vector of slack variables instead of the L1-norm. Further, the solution is obtained through a simple, linearly convergent iterative approach instead of solving a quadratic programming problem (QPP). A comparative performance analysis of the proposed FQLTBSVM and FCLTBSVM against the twin support vector machine (TSVM), twin bounded support vector machine (TBSVM) and LTBSVM has been carried out on standard real-world datasets as well as artificial datasets using linear and Gaussian kernels. This analysis clearly shows that the proposed approaches are more effective and applicable than the others in terms of generalization performance and computational speed. The proposed approaches are statistically validated and verified on various measures such as accuracy, sensitivity, recall, precision, F1-score and G-mean.
    Keywords: twin support vector machine; twin bounded support vector machine; Lagrangian function; iterative approaches.

  • On Primes, Semiprimes and their Applications   Order a copy of this article
    by Venkataraman Yegnanarayanan, Gayatri Narayana YEGNANARAYANAN, Veena Narayanan Narayanan 
    Abstract: The existence of hidden symmetries in the distribution of prime numbers, together with open problems such as how fast the prime factors of a given semiprime can be found, and whether every sufficiently large even integer can be realized as the sum of either two primes or a prime and a semiprime, has driven us to probe the significant role of the Catalan numbers in semiprime decomposition. We prove some results concerning the distribution of primes and semiprimes in terms of functions of primes.
    Keywords: Primes; Semiprimes; Euler totient function; Catalan numbers; Cryptography.

  • An efficient memory based differential evolution for constrained optimization   Order a copy of this article
    by Raghav Prasad Parouha 
    Abstract: In optimization, the performance of differential evolution (DE) and the hybrid versions that exist in the literature is highly affected by inappropriate choices of its operators, such as mutation and crossover. In general practice, DE does not employ any strategy for memorizing the best-so-far results obtained in the earlier part of the previous generation. In this paper, a new memory-based DE (MBDE) is presented, in which two swarm operators are introduced. These operators are based on the pbest and gbest mechanisms of particle swarm optimization. The proposed MBDE is employed to solve the CEC2006 and CEC2010 constrained benchmark functions. The results of MBDE are compared with state-of-the-art algorithms. Numerical, statistical and graphical analyses reveal the competency of the proposed algorithm.
    Keywords: Differential Evolution; Particle Swarm Optimization; Mutation; Crossover; Elitism; Constrained optimization.
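As a hypothetical illustration of how a best-so-far memory enters a DE mutation operator (the MBDE operators themselves are defined in the paper), the common DE/best/1 variant perturbs a remembered best vector with a scaled difference of two random population members:

```python
import random

def de_best_1(pop, best, F=0.5):
    """DE/best/1 mutation: v = best + F * (x_r1 - x_r2)."""
    r1, r2 = random.sample(range(len(pop)), 2)
    return [b + F * (x1 - x2) for b, x1, x2 in zip(best, pop[r1], pop[r2])]

# With a converged (all-identical) population the difference term vanishes
# and the mutant equals the remembered best, anchoring the search.
pop = [[1.0, 2.0]] * 4
print(de_best_1(pop, best=[1.0, 2.0]))  # → [1.0, 2.0]
```

The memory term (`best`) plays the same steering role as the gbest position in particle swarm optimization.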

  • Aspect-based Opinion Mining of Customer Reviews for Product Usability Evaluation using Natural Language Processing   Order a copy of this article
    by Ajay Kumar, Pradip Swarnakar, Saurabh Maurya 
    Abstract: With the commencement of Web 2.0, more value started to be given to the active engagement of users and communities on the web, primarily through the knowledge contribution of each member to supplement global information. Opinion mining (OM), also known as sentiment analysis (SA), is a field of web content mining that aims to extract valuable information from users' opinions. More often than not, potential buyers look for feedback from other users, based on their judgment and experiences of the product, which helps them make an informed decision. But the task of analyzing the large number of reviews present on the internet can be tiresome and time-consuming for a person. So, this study proposes a model by which products are analyzed and ratings are given to individual features of a product based on the reviews. Thus, products can be compared from both subjective and objective perspectives at the feature level. A user looking for a specific set of features in a product can quickly analyze products and compare them based on the required features. The system also recommends products to the user based on his requirements. This study uses the Stanford Natural Language Processing toolkit to mine opinion words and corresponding feature terms using dependency grammar and a feature list. The results of the proposed model are presented to the user in a relaxed and understandable manner.
    Keywords: Sentiment Analysis; Web Mining; User Perception; Natural Language Processing; Opinion Score.

  • A Community Based Trusted Collaborative Filtering Recommender Systems Using Pareto Dominance Approach   Order a copy of this article
    by Anupama Angadi, Satya Keerthi Gorripati 
    Abstract: Recommender system algorithms provide a remedy for the information overload problem suffered by netizens. The collaborative filtering approach takes the user-item rating matrix as input and recommends items based on the perceptions of similar neighbours. However, sparsity in the rating matrix leads to untrustworthy predictions, and the conventional collaborative filtering method chooses ineffective, unrepresentative users as neighbours for each target user, meaning the recommendations made by the system remain inaccurate. The proposed approach addresses this issue by applying a pre-filtering process and integrating community detection with Pareto dominance: it considers trusted neighbours from the community to which the active user belongs and eliminates dominated users from the neighbourhood. The results on the proposed framework show a noteworthy improvement in all accuracy measures compared with the traditional approaches.
    Keywords: Community Detection; Recommender Systems; Sparsity; Pareto dominance; Cold Start; Trust propagation;.
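The Pareto-dominance filter idea can be sketched in a few lines (criteria and data here are invented for illustration): candidate neighbours that are dominated on every criterion by some other candidate are removed before prediction, leaving only the non-dominated front.

```python
def dominates(a, b):
    """a dominates b: at least as good on all criteria, strictly better on one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only non-dominated candidates (the Pareto-optimal neighbours)."""
    return [c for c in candidates
            if not any(dominates(d, c) for d in candidates if d != c)]

# Each tuple: (rating similarity, trust score) for a candidate neighbour.
neighbours = [(0.9, 0.2), (0.2, 0.8), (0.5, 0.1), (0.1, 0.1)]
print(pareto_front(neighbours))  # → [(0.9, 0.2), (0.2, 0.8)]
```

Neither survivor beats the other on both criteria, so both remain as useful neighbours, while the two dominated candidates are discarded.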

  • An Efficient Data Aggregation with Optimal Recharging in Wireless Rechargeable Sensor Networks   Order a copy of this article
    by S. Pavithra, P.M. Anu 
    Abstract: Energy consumption is optimized for packet transmission in a clustered network. A base station serves multiple mobile nodes covered within a particular distance range, and cluster groups are elected from the sink region using a clustering algorithm. The cluster head is chosen based on the highest connectivity of nodes. Each sub-node is then connected to the group leader based on its assigned node ID, with the energy level to be maintained. Each cluster head is connected to other cluster heads via the sink. Packets are transmitted from each mobile node to its cluster-head region, and an index table of unique files is maintained for cluster-head-to-sink and sink-to-cluster communication. If the sink's packet buffer overflows, cluster-head-to-cluster-head communication can be established, improving network lifetime and throughput while reducing energy consumption, transmission cost and delay.
    Keywords: Rechargeable sensor network; energy consumption; Balanced energy allocation scheme.
    DOI: 10.1504/IJAIP.2022.10040244
    by Vijayakumar K, Dafni Rose J 
    Abstract: Document summarization is the process which condenses a given document to generate a summary that captures the main essence of the entire document. In recent years, there has been increased interest in automatic summarization. Automatic summarization refers to summarizing a document using software; it helps to reduce large text documents to a short set of words or a paragraph that delivers the main meaning of the full text. Using features extracted from documents for automatic summarization is a successfully proven approach, but it has drawbacks with respect to structure, redundancy and coherence. Existing methods for single document summarization usually make use of only the first sentence or a fixed number of words from the beginning of the specified document. This paper proposes a technique that uses the contents of the entire document to provide more knowledge for single document summarization. The proposed system mainly aims at generating a summary of at least a minimum length, unlike the existing system, which generates an empty summary if it cannot find a keyword in the input document whose attention weight exceeds a threshold. The proposed system also focuses on maintaining the structure of the summaries generated for the given document.
    Keywords: text; summarize; document; recurrent.

  • An Efficient Composable 1-out-of-2 Oblivious Transfer Scheme using Vector Decomposition   Order a copy of this article
    by Praveen I, Aravind Vishnu S S, Sethumadhavan M 
    Abstract: The internet has revolutionized communications, to the extent that it is now one of the most preferred media of everyday communication. Even though the messages sent and received are not accessible to a third party, by continuously tracking the websites accessed by users it is possible to identify their character and interests, thus increasing the risk to privacy. Oblivious transfer is one of the best solutions to this problem. A k-out-of-n oblivious transfer scheme is an interaction between a receiver and a sender in which the sender possesses n messages and the receiver needs to access k of them. The scheme is a mechanism in which the receiver obtains no more than the k messages queried for, with the queried indices remaining oblivious to the sender. We put forward a 1-out-of-2 oblivious transfer scheme. The security of our scheme is based on the decisional subspace assumption, the decisional linear assumption and the computational infeasibility of the vector decomposition problem (the VDP assumption). Our scheme uses points on an elliptic curve as the ciphertext. Further, we prove the security of the proposed construction in the universal composability framework using the FCRS-hybrid model.
    Keywords: Oblivious Transfer; Vector Decomposition Problem;Pairings; Universal Composability.

  • Web Server Workload Prediction using Time Series Model   Order a copy of this article
    by Mahendra Pratap Yadav, Akanksha Kunwar, Ram Chandra Bhushan, Dharmendra Kumar Yadav 
    Abstract: In distributed systems, multi-tier storage systems and cloud data centers are used for resource sharing among several clients. To fulfill clients' requests, cloud providers share their resources and manage the workload, which introduces many performance challenges and issues. One of the main challenges is resource provisioning in virtual machines (VMs) or containers, since VMs must meet the demands of users with different profiles and Quality of Service (QoS) requirements. This proactive resource management approach requires an appropriate workload prediction strategy for real-time series data. The time series exhibits prominent periodic patterns, with the workload evolving from one point in time to another with some short-term random fluctuation. In this paper, a solution to the web server load prediction problem is proposed, based on the seasonal ARIMA (Autoregressive Integrated Moving Average) model. ARIMA is a forecasting technique which predicts future values based on the inertia of the series. In seasonal ARIMA, seasonal AR and MA terms are used to predict the value xt (the CPU workload time series) with the help of data values and errors at time lags that are multiples of the span of the seasonality. We have evaluated our proposed method using real-world web workload data.
    Keywords: Cloud Computing; Elasticity; Auto-scaling; Time Series; Machine Learning.
    DOI: 10.1504/IJAIP.2022.10034175
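The seasonal structure the abstract exploits can be illustrated, in a much-simplified stdlib form, by seasonal differencing and a seasonal-naive forecast; a full SARIMA model adds AR and MA terms on top of this idea (example data invented):

```python
def seasonal_diff(series, s):
    """Remove the seasonal component: d_t = x_t - x_{t-s}."""
    return [series[i] - series[i - s] for i in range(s, len(series))]

def seasonal_naive_forecast(series, s, steps):
    """Forecast by repeating the last observed season (SARIMA's simplest case)."""
    hist = list(series)
    out = []
    for _ in range(steps):
        out.append(hist[-s])
        hist.append(out[-1])
    return out

# A perfectly periodic CPU load (season length s=4) forecasts exactly.
load = [10, 40, 70, 30] * 3
print(seasonal_naive_forecast(load, s=4, steps=4))  # → [10, 40, 70, 30]
print(seasonal_diff(load, 4))  # all zeros: the seasonal pattern is constant
```

On real workloads the differenced series is not zero, and it is that residual which the ARIMA(p, d, q) terms then model.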

Special Issue on: ICASISET 2018 M-learning Applications for Future Mobile Networks

  • A visualization technique of extracting hidden patterns for maintaining road safety   Order a copy of this article
    by C. Selvarathi, S. Subha, G. Madasamy Raja, K. Vidhya Lakshmi 
    Abstract: Predictive analysis is a data mining technique that deals with extracting hidden information from collections of data and using it to predict trends and future behavior patterns. It extracts the relationships between explanatory variables and predicted variables from past occurrences and exploits those variables to predict future unknown outcomes. This research paper is designed to enable traffic management controllers to use historical traffic and road accident data to observe how traffic congestion and accidents arise on roadways. Information about the events that lead to accidents is collected and a database is created. These data are analyzed using data mining techniques to predict traffic congestion and accidents. The main contribution of this paper is a visualization technique that predicts heavy traffic and highlights the history of accidents. We introduce a risk index value that represents the history of accidents and predicts accident-prone areas. The visualization represents accident-prone areas in different colors according to the risk index value. It is known that working with maps in real time is the best way to reduce potential accidents, so this study uses the Maptive tool for creating visualizations. By visualizing and analyzing road data on maps, we can predict the probability of accidents at a location and avoid travelling on those paths.
    Keywords: Predictive analytics; Road accident data mining; Machine learning techniques; Maptive tool; Random-forest method; Data Visualization.
    DOI: 10.1504/IJAIP.2018.10042067

Special Issue on: IFSCOM 2018 Extensions to Type-1 Fuzzy Sets

  • Fuzzy Multi-Attribute Decision Making Approach for the Selection of Software Effort Estimation Models   Order a copy of this article
    by Ashu Bansal, Neeraj Gupta, Rakesh Garg 
    Abstract: Evaluating and selecting among several alternatives with respect to numerous contradictory decision parameters is a highly challenging task in many problems. Multi-attribute decision making (MADM) is widely used for solving decision-making problems in almost all areas of engineering and management, and researchers have established numerous MADM methods for resolving such problems in a precise way. Estimating the distance of each alternative from an optimal/ideal point is the common idea implemented in many of these methods. To deal with the ambiguity that may occur in the decision-making process, fuzzy set theory (FST) is widely used. In this research, the existing distance-based approach (DBA) is extended for the first time with interval type-2 fuzzy sets. This extended approach, the interval type-2 fuzzy distance-based method (ITFDBM), is then applied to a very challenging problem: the selection and ranking of software effort estimation models. The results obtained in this study show that ITFDBM is highly efficient and stable in solving such complex MADM problems.
    Keywords: Multi-attribute decision making; Interval type-2 fuzzy sets; ITFDBM; Software effort estimation model.

  • An optimized fuzzy edge detector for image processing and their use in modular neural networks for pattern recognition   Order a copy of this article
    by Isidra Espinosa-Velazquez, Patricia Melin, Claudia Gonzalez, Frumen Olivas 
    Abstract: In this paper, the development of a fuzzy edge detector optimized with two metaheuristics, genetic algorithms and particle swarm optimization, is presented. It is based on the sum-of-differences method, using as inputs the absolute values of the differences between the pixels in the image. Pratt's figure of merit was used to measure the performance of the proposed fuzzy edge detector. A modular neural network was designed for the recognition of faces in benchmark images, and comparisons were made with other works based on different fuzzy edge detection systems. The main contribution of this research work is the development of a new, optimized fuzzy edge detection method.
    Keywords: fuzzy logic; fuzzy edge detector; optimization; GA; genetic algorithm; PSO; Particle swarm Optimization; Neural networks.

Special Issue on: Data Centre Networking and Cloud Computing

  • Effective Resource Monitoring and Management system for Cloud Environment   Order a copy of this article
    by M. Dakshayini, Rekha P M 
    Abstract: Cloud computing is an emerging technology for sharing many resources such as memory, storage and CPU. Utilization of these resources may vary due to the adaptive nature of cloud computing. Effective management and continuous monitoring of these resources over time is essential to making the cloud computing system successful. Hence, in this work, an effective virtual machine resource monitoring and management system using a software-defined networking approach is proposed and implemented using Mininet. It allocates jobs to the most suitable virtual machines, thereby achieving an effective reduction in response time for cloud users. The results obtained prove the efficiency of the proposed approach, with improved CPU utilization, higher throughput and reduced response time. Comparison of the proposed algorithm with existing algorithms shows improved cloud system performance with the proposed approach.
    Keywords: Resource; Monitor; Software defined Network; system performance; CPU; Throughput.

  • Detection of Phishing Websites using Data mining Tools and Techniques   Order a copy of this article
    by Mansi Somani, Mamatha Balachandra 
    Abstract: Phishing is one of the most common attacks used to obtain users' sensitive information and one of the major cyber-security issues. To eradicate phishing attacks, it is important that users or software are able to detect them first. A popular approach to carrying out a phishing attack is through generated phishing URLs. A URL can be legitimate or phishy, which makes phishing fit a classic classification problem in data mining. Hence, data mining algorithms and techniques can be used for effective classification of websites as legitimate or phishy. The algorithms C4.5 (J48), SVM, Random Forest, Treebag and GBM have been trained and compared on accuracy, recall and precision, which helps in determining the model best suited to detecting phishing websites. Rules have been listed that categorize the features making a website phishy or legitimate. The work has been carried out using RStudio; the programming language employed is R. The dataset used comprises 11,055 tuples and 31 attributes; it is trained, tested and used for detection. Among the five classifiers used, the best accuracy, 97.21%, was obtained with the Random Forest model.
    Keywords: Phishing; Security; Data mining; Features; Classifier; Accuracy; Precision; Recall; Confusion matrix.
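A toy sketch of the kind of URL features such classifiers consume (the feature names and thresholds here are illustrative inventions, not the dataset's actual 31 attributes):

```python
import re
from urllib.parse import urlparse

def url_features(url):
    """Extract a few classic phishing indicators from a URL."""
    host = urlparse(url).hostname or ""
    return {
        "has_at_symbol": "@" in url,          # '@' can obscure the real host
        "ip_address_host": bool(re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host)),
        "long_url": len(url) > 75,            # long URLs hide the true domain
        "many_subdomains": host.count(".") > 3,
    }

print(url_features("http://192.168.10.5/secure/login")["ip_address_host"])  # → True
print(url_features("https://example.com/")["has_at_symbol"])  # → False
```

Vectors of such boolean features, one row per URL, are exactly what tree ensembles like Random Forest then split on.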

  • Energy Efficient Hybrid ARQ Scheme for Cooperative Communication in Underwater Acoustic Sensor Networks   Order a copy of this article
    by Goutham Veerapu, Harigovindan V.P. 
    Abstract: In this article, we present an analytical model to compute the energy efficiency of Underwater Acoustic Sensor Networks (UASNs) in deep and shallow water scenarios. Acoustic parameters such as distance-dependent usable bandwidth, acoustic fading and the different propagation characteristics of the underwater channel are taken into account in this model. Further, we propose a Hybrid Automatic Repeat Request scheme for Cooperative Communication (HARQ-CC) in UASNs. It combines Reed-Solomon codes and selective retransmissions in cooperative communication to optimize the energy efficiency. The analytical results clearly indicate that HARQ-CC can improve the energy efficiency when the source and destination nodes are separated by larger distances. We propose an optimization algorithm to maximize the energy efficiency by adaptively adjusting the modulation level and packet size as a function of the distance between the source and destination nodes. It is observed that by using the optimization algorithm, the energy efficiency is further improved for HARQ-CC, leading to a highly reliable and energy efficient HARQ-CC scheme for UASNs.
    Keywords: Cooperative Communication; Reed-Solomon Codes; Energy Efficiency; Hybrid ARQ.

  • Prediction based Spatial Correlation Clustering Algorithm for Efficient Data Reduction in Wireless Sensor Network   Order a copy of this article
    by Raja Periyasamy, Deepa T 
    Abstract: Wireless sensor networks (WSNs) are adopted in wireless communication for many application areas due to their low computational cost. However, a WSN has a limited lifetime due to the energy consumed during communication between the source and the sink nodes. This energy problem is reduced by data reduction techniques, which decrease unnecessary data transmission. Spatial correlation also arises in WSNs over large areas; therefore, scheduling based on spatial correlation is used to increase the coverage area and decrease energy consumption. However, packet loss occurs during data transmission due to readings of similar magnitude. Hence, the prediction-based spatial correlation clustering (PSCC) method is proposed for efficient data reduction and maximization of the effective area. The simulation results illustrate that the proposed method achieves a greater enhancement in convergence speed, providing high data reduction and lower energy consumption compared with existing approaches.
    Keywords: Data reduction; Energy efficiency; Prediction method; Spatial correlation; Wireless sensor network.

  • Density- Aware Replica Server Placement for Utilization Enhancement   Order a copy of this article
    by Darothi Sarkar, Nitin Rakesh, Krishna Kumar Mishra, Anupama Mehra 
    Abstract: The most challenging issue in content delivery networks (CDNs) is placing the surrogate servers. K-means clustering is a very simple and effective approach for placing surrogates over the network, but its main drawback is that it may produce clusters with very few or no nodes; in a CDN, this leads to a situation where a surrogate serves no clients. This paper introduces a parameter called the population threshold for each cluster and optimizes the number of surrogates by not placing servers in under-populated clusters. The work also identifies nearby servers that can serve requests from clusters whose population is below the defined threshold, depending on two parameters: the utilization factor and the traffic load strength of the surrogates.
    Keywords: Content Delivery Network; Utilization factor; Traffic Load; Population Threshold; Deployment cost.
    DOI: 10.1504/IJAIP.2022.10020776
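A minimal one-dimensional sketch of the population-threshold idea (node positions and threshold are invented): under-populated clusters lose their surrogate, and their members are redirected to the nearest surviving one.

```python
def place_surrogates(clusters, threshold):
    """Drop surrogates for clusters below the population threshold and
    reassign their members to the nearest kept surrogate (1-D positions)."""
    kept = [c for c in clusters if len(c) >= threshold]
    centroids = [sum(c) / len(c) for c in kept]
    reassigned = {}
    for c in clusters:
        if len(c) < threshold:
            for node in c:
                reassigned[node] = min(
                    range(len(centroids)),
                    key=lambda i: abs(node - centroids[i]))
    return centroids, reassigned

# The singleton cluster [100] loses its surrogate; node 100 is served by
# the surrogate at centroid 11.0 (index 1).
print(place_surrogates([[1, 2, 3], [10, 11, 12], [100]], threshold=2))
```

The paper's version additionally weighs the candidate servers' utilization factor and traffic load before choosing which surviving surrogate takes over.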
  • Energy Efficient VM Placement and Task Scheduling Techniques to achieve Green Cloud Environment   Order a copy of this article
    by B.S. Rajeshwari, M. Dakshayini 
    Abstract: The continuous increase in demand for cloud services has resulted in increased workloads. Effective distribution of these augmented tasks on a limited infrastructure is a challenging issue, as it necessitates servers running continuously, raising power usage in data centers. High power consumption results in the release of a major amount of CO2 into the environment, polluting it. So, there is a need for green cloud computing to reduce this harmful effect. Designing an energy-efficient data center is a perennial challenge. Hence, in this paper, an optimal virtual machine placement strategy using a knapsack dynamic programming technique and an efficient precedence-based task distribution approach are proposed to alleviate power consumption while achieving quality of service and maintaining service level agreements. The implementation results prove the approach's power efficiency with better quality of service.
    Keywords: Virtual Machine Allocation; Power Consumption; Cloud Computing; Service Level Agreement; Load Distribution.
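The knapsack dynamic programming view of VM placement can be sketched as follows (a toy 0/1 knapsack over integer CPU loads; the paper's actual strategy and parameters may differ):

```python
def max_packed_load(capacity, vm_loads):
    """0/1 knapsack DP: the largest total VM load that fits on one server,
    so fewer servers need to stay powered on."""
    dp = [0] * (capacity + 1)
    for load in vm_loads:
        # iterate capacity downwards so each VM is placed at most once
        for c in range(capacity, load - 1, -1):
            dp[c] = max(dp[c], dp[c - load] + load)
    return dp[capacity]

# VMs needing 6, 5 and 4 CPU units on a 10-unit server: pack 6 + 4 = 10.
print(max_packed_load(10, [6, 5, 4]))  # → 10
```

Filling each server as close to capacity as possible is what lets the remaining servers be powered down, which is the energy-saving lever of the placement strategy.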

  • A novel Framework for Insurable Archival Services: A Protocol to create secured marketplace for Archival Services using Cloud   Order a copy of this article
    by A.H. Shanthakumara, N.R. Sunitha 
    Abstract: An archival service is a key element not only for archiving personal documents, but is now considered an investment, treating an archival document as a valuable property. Demand has evolved with current technology, which has grown rapidly in recent years to provide high-capacity storage, low-cost maintenance and high security over long periods of time. Nowadays, users' archival needs have become more advanced and distinct: some users need very long-term preservation, while others require high security, and no single solution seems to accommodate all the needs of archival users. For a better archival service, a driver is needed to bring change to the marketplace. This paper proposes a novel framework that includes insurability as a value-added service, and presents a way to consolidate the current archival business forum through proper drivers in a dynamic marketplace. The paper also presents a provably secure and practically efficient protocol, which provides a secure, simple and efficient mechanism to address the needs of all stakeholders in the current archival system.
    Keywords: Archival Data; Insurability; Archival Services; Long Term Preservation.

  • Effective Hybrid Feature Subset Selection for multilevel Datasets using Decision Tree Classifiers   Order a copy of this article
    by Dinakaran S, Ranjit Jeba Thangaiah P 
    Abstract: Feature selection is one of the most significant procedures in machine learning, particularly for improving the performance and prediction accuracy of complex data classification. This paper discusses a hybrid feature selection technique combined with decision tree based classification algorithms. The features selected using information gain (IG) are combined with the features selected by ReliefF to generate a resultant feature subset, which is in turn combined with a correlation-based feature selection (CFS) method to generate the aggregated feature subset. To measure classification accuracy on the aggregated feature subset, different decision tree based classification algorithms, such as C4.5, Decision Stump, Naive Bayes Tree and Random Forest, were deployed with ten-fold cross-validation. To check the prediction accuracy of the proposed work, eight different multilevel UCI (University of California, Irvine) machine learning datasets, with small to large numbers of features, were used. The main objective of the hybrid feature selection is to improve classification accuracy and prediction and to reduce execution time on standard datasets.
    Keywords: Feature selection; Decision tree; Information Gain (IG); ReliefF; Correlation based Feature Selection (CFS); Naïve Bayes Tree; Random Forest; C4.5; Decision Stump; Exclusive OR; Intersection; Ranker.
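The information gain criterion in the hybrid pipeline can be sketched in a few lines (toy data; the paper would use standard library implementations):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a class-label sequence."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """Entropy reduction from splitting the labels on a discrete feature."""
    n = len(labels)
    gain = entropy(labels)
    for v in set(feature):
        subset = [l for f, l in zip(feature, labels) if f == v]
        gain -= (len(subset) / n) * entropy(subset)
    return gain

# A feature that splits the classes perfectly recovers the full entropy (1 bit).
print(information_gain([0, 0, 1, 1], ["no", "no", "yes", "yes"]))  # → 1.0
```

Ranking features by this score, intersecting with the ReliefF ranking, and then applying CFS is the aggregation the abstract describes.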