Forthcoming articles


International Journal of Advanced Intelligence Paradigms


These articles have been peer-reviewed and accepted for publication in IJAIP, but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.


Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.


Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.


Articles marked with this Open Access icon are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licenses.


Register for our alerting service, which notifies you by email when new issues of IJAIP are published online.


We also offer RSS feeds which provide timely updates of tables of contents, newly published articles and calls for papers.


International Journal of Advanced Intelligence Paradigms (218 papers in press)


Regular Issues


  • Statistical Pair Pruning towards Target Class in Learning Based Anaphora Resolution for Tamil   Order a copy of this article
    by Arul Deepa Kannan, Deisy C 
    Abstract: Anaphora resolution is an important task in many natural language understanding (NLU) applications, including machine translation. This paper proposes a learning-based system, built around various classification algorithms, to resolve pronouns in Tamil text. To improve learning accuracy, the system is built in two stages. The first is feature vector production, where mentions are identified and characterized, and feature vectors of lexical, syntactic and semantic features are produced. The second is the pair pruning module, where the number of non-target-class pairs is reduced by deep statistical analysis of the feature vectors. Incorporating the pair pruning module dramatically increases the f-measure score compared to training the same models without it. We trained the system with various classification algorithms on the tourism dataset of TDIL and obtained encouraging results for Tamil, a challenging language. We also discuss how the f-measure, precision and recall vary with and without the pruning module in a comparative model.
    Keywords: anaphora resolution; classification; machine learning; pronoun resolution; Tamil computing; co-reference resolution; Natural Language Understanding; Natural Language Processing.

  • Particle Swarm Optimization based Parameters Optimization of PID Controller for Load Frequency Control of Multi-area Reheat Thermal Power Systems   Order a copy of this article
    by Jagatheesan Kaliannan, Anand Baskaran, Sourv Samanta, Nilanjan Dey, Valentina E. Balas 
    Abstract: This work presents Load Frequency Control (LFC) of a multi-area reheat thermal power system with a Proportional-Integral-Derivative (PID) controller. The interconnected control areas are each provided with a single-stage reheat turbine. The proportional gain (Kp), integral gain (Ki) and derivative gain (Kd) values of the PID controller are simultaneously optimized using a recent and powerful evolutionary computational intelligence technique, namely the Particle Swarm Optimization (PSO) algorithm. The superiority of the proposed PSO-based PID controller is demonstrated by comparing its performance to recently published optimization techniques, namely Hill Climbing (HC) algorithm and Genetic Algorithm (GA) tuned controllers for the same multi-area thermal power system. For the analysis, time domain specifications and a one percent Step Load Perturbation (1% SLP) in thermal area 1 are considered. The simulation results show that the proposed PSO-based PID controller provides a superior dynamic response over the other optimization technique (HC and GA) based PID controllers.
    Keywords: Load Frequency Control (LFC); Proportional-Integral-Derivative (PID); evolutionary computational intelligence; optimization; Hill Climbing (HC) algorithm; Genetic Algorithm (GA); Particle Swarm Optimization (PSO).
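
    The PSO-based gain tuning described in this abstract can be sketched generically: particles are candidate (Kp, Ki, Kd) triples, and the fitness is a closed-loop error cost. The sketch below is not the authors' power-system model; it uses a hypothetical first-order plant and an integral-squared-error cost purely to illustrate the tuning loop.

    ```python
    import random

    def pid_cost(gains, steps=200, dt=0.05):
        # Toy stand-in plant: dy/dt = -y + u, tracking a unit step reference.
        kp, ki, kd = gains
        y, integ, prev_e, cost = 0.0, 0.0, 0.0, 0.0
        for _ in range(steps):
            e = 1.0 - y
            integ += e * dt
            deriv = (e - prev_e) / dt
            u = kp * e + ki * integ + kd * deriv   # PID control law
            y += dt * (-y + u)
            cost += e * e * dt                     # integral squared error
            prev_e = e
            if abs(y) > 1e6:                       # diverged: penalize heavily
                return 1e9 + cost
        return cost

    def pso(cost_fn, dim=3, n=20, iters=60, lo=0.0, hi=10.0, w=0.7, c1=1.5, c2=1.5):
        random.seed(1)
        pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
        vel = [[0.0] * dim for _ in range(n)]
        pbest = [p[:] for p in pos]
        pbest_cost = [cost_fn(p) for p in pos]
        g = min(range(n), key=lambda i: pbest_cost[i])
        gbest, gbest_cost = pbest[g][:], pbest_cost[g]
        for _ in range(iters):
            for i in range(n):
                for d in range(dim):
                    # Standard velocity update: inertia + cognitive + social terms
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * random.random() * (pbest[i][d] - pos[i][d])
                                 + c2 * random.random() * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
                c = cost_fn(pos[i])
                if c < pbest_cost[i]:
                    pbest[i], pbest_cost[i] = pos[i][:], c
                    if c < gbest_cost:
                        gbest, gbest_cost = pos[i][:], c
        return gbest, gbest_cost

    gains, cost = pso(pid_cost)   # best (Kp, Ki, Kd) found and its cost
    ```

    Swapping `pid_cost` for a simulation of the actual multi-area system (and the cost for a time-domain performance index) recovers the scheme the abstract describes.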

  • Automatic Generation Control of Thermal Generating Unit using Evolutionary Controller   Order a copy of this article
    by Ashish Dhamanda 
    Abstract: This paper obtains the dynamic response of the deviation in load frequency and the corresponding tie-line power of automatic generation control (AGC) in a three-area interconnected thermal power system using two different controllers: a fuzzy logic controller and an evolutionary controller (GA-tuned PID). The evolutionary controller is proposed to improve the load frequency and tie-line power deviation performance, and its dynamic responses are compared with those of the intelligent controller. The results indicate that the proposed controller exhibits better performance and satisfies the automatic generation control requirements with a reasonable dynamic response. The performances of the controllers are simulated using MATLAB/SIMULINK 2013a software.
    Keywords: Proportional plus Integral plus Derivative (PID); Genetic Algorithm (GA); Automatic Generation Control (AGC); Fuzzy logic; Evolutionary Controller.

  • Analyzing Effect of Multi-Versioning for Software Updates on Reliability: An Utility following Pheromone Trail of Social Insects   Order a copy of this article
    by Nishi Kant Kumar, Soumya Banerjee 
    Abstract: Software evolves, and the frequent release of new versions and patches has always been a challenge. Conventional users often refuse to upgrade their regular software applications, relying instead on outdated versions flawed with vulnerabilities or missing useful features and bug fixes. The software engineering community accommodates such requirements for version migration and is also looking for an analytical support system to scrutinize the post-convergence of version control. Hence, this paper presents a novel bio-inspired utility that analyzes the post-convergence of version control of software applications and automatically tests each submitted patch, looking for potential bugs it introduces. The bio-inspired phenomenon draws on the pheromone deposition and evaporation behavior of social insects, and we demonstrate how it largely affects the optimization of the local minima problem across the path of social insects. Similarly, the version control effect has been modeled with its parameters, which can yield a useful post-version-control paradigm and may assist the software community through the developed application plug-in.
    Keywords: Multiple Version; Version Control; Social Insect; Pheromone trail; Post Version Control Analytics.

    by Rama Rao K V S N, Sudheer Kumar Battula, Lakshmi Siva Rama Krishna Talluri 
    Abstract: As dependence on networks for day-to-day transactions has been increasing tremendously, protecting critical information resources from intruders has always been a thrust area of research. Building an Intrusion Detection System (IDS) for an enterprise is a complex and challenging task, as attack types are many and growing day by day. Hence there is a need for a smart heuristic scanner in an IDS to perform deep packet inspection in order to detect newer forms of attacks and decisively declare a source as trusted or un-trusted. To perform deep packet inspection, packet headers at the Transport and Network layers are captured as heuristics and processed through two levels of machine learning classifiers; the Transmission Control Protocol (TCP) and the Internet Protocol (IP) are the Transport layer and Network layer protocols, respectively. In the first stage, Naive Bayes is applied to selected TCP-level heuristics. The output of the first-stage classifier and the IP-level heuristics are given as input to a k-nearest neighbors (KNN) classifier in the second stage. At the end of the second-stage classification, sources are declared trusted or un-trusted. The experimental results show that the proposed approach is efficient in terms of detection rate and false alarms.
    Keywords: Machine Learning; IDS; Heuristic; Classifiers; TCP/IP Packets.
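
    The two-level cascade this abstract describes (Naive Bayes over TCP-level heuristics, its score then fed with IP-level heuristics into a KNN) can be sketched as below. This is a toy illustration of the general staging idea, not the authors' feature set: the feature values and labels are invented, and the real system would use actual packet-header fields.

    ```python
    import math

    def gnb_fit(X, y):
        """Per-class prior, mean and variance for a Gaussian Naive Bayes."""
        stats = {}
        for cls in set(y):
            rows = [x for x, lbl in zip(X, y) if lbl == cls]
            n = len(rows)
            means = [sum(col) / n for col in zip(*rows)]
            var = [max(sum((v - m) ** 2 for v in col) / n, 1e-6)
                   for col, m in zip(zip(*rows), means)]
            stats[cls] = (n / len(y), means, var)
        return stats

    def gnb_score(stats, x):
        """Log-posterior margin of class 1 (un-trusted) over class 0 (trusted)."""
        def loglik(cls):
            prior, means, var = stats[cls]
            return math.log(prior) + sum(
                -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
                for xi, m, v in zip(x, means, var))
        return loglik(1) - loglik(0)

    def knn_predict(train, x, k=3):
        """Majority vote among the k nearest training points."""
        dists = sorted((sum((a - b) ** 2 for a, b in zip(t, x)), lbl)
                       for t, lbl in train)
        votes = [lbl for _, lbl in dists[:k]]
        return max(set(votes), key=votes.count)

    # Hypothetical heuristics: [tcp_f1, tcp_f2] and [ip_f1]; label 1 = un-trusted
    tcp = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
    ip  = [[0.1], [0.2], [0.9], [0.8]]
    y   = [0, 0, 1, 1]

    stats = gnb_fit(tcp, y)
    # Stage-2 input = stage-1 score concatenated with the IP-level heuristics
    stage2 = [([gnb_score(stats, t)] + i, lbl) for t, i, lbl in zip(tcp, ip, y)]
    label = knn_predict(stage2, [gnb_score(stats, [0.85, 0.85])] + [0.85])
    ```

    The design point is that stage 2 never sees raw TCP features, only the stage-1 posterior margin, so the KNN combines an already-summarized transport-layer judgment with network-layer evidence.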

  • Development of Smart RFID Reader for Generous Application   Order a copy of this article
    by Waqas Malik, S.M. Usman Ali Shah 
    Abstract: The pace of growth in technology is rapidly increasing; therefore simpler and more effective design methods are required. A real-time smart RFID system is developed to restrict the entrance of employees into an office. The work in this paper mainly concerns developing an Android application interfaced with Bluetooth, an RFID reader and an antenna. RFID is a more suitable replacement for the conventional bar code method. Every (passive) tag is allocated a unique binary code. Frequency-shift keying modulation is used for communication. The LF operating frequency of the reader is 131.2 kHz for the lower bit and 123.2 kHz for the higher bit. The LF band is obtained by the TMS 3705 IC and an antenna of copper enameled wire with calculated specifications such as the number of turns and the thickness of the wire. Interfacing between the IC and the controller for binary bit verification is part of this system. The relay operation is controlled, and the record maintained, by an Android application. The system can be utilized for a wide range of applications.
    Keywords: Smart RFID; Reader; Antenna design.

  • On Domination and Total Domination Number of a Collaboration Graph   Order a copy of this article
    by Venkataraman Yegnanarayanan, Renuka Lakshmi, Valentina E.Balas 
    Abstract: We determine the domination and total domination number of the Rolf Nevanlinna Prize winners (1982-2014) collaboration graph (RNPCG), besides giving a brief introduction to the prize, the method of construction of the collaboration graph, and a software tool used for drawing it.
    Keywords: Erdős Number; Rolf Nevanlinna Prize; Domination Number.
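
    For readers unfamiliar with the two invariants this abstract computes: a dominating set must place every vertex in the set or adjacent to it, while a total dominating set additionally requires every vertex (including set members) to have a neighbor in the set. For small graphs both numbers can be found by brute force; this is a generic sketch, not the authors' tool, illustrated on a 5-cycle.

    ```python
    from itertools import combinations

    def is_dominating(graph, subset):
        """Every vertex is in the subset or adjacent to a member of it."""
        covered = set(subset)
        for v in subset:
            covered |= graph[v]
        return covered == set(graph)

    def domination_number(graph):
        verts = list(graph)
        for k in range(1, len(verts) + 1):
            for subset in combinations(verts, k):
                if is_dominating(graph, subset):
                    return k

    def total_domination_number(graph):
        """Smallest set such that every vertex has a neighbor in the set."""
        verts = list(graph)
        for k in range(1, len(verts) + 1):
            for subset in combinations(verts, k):
                if all(graph[v] & set(subset) for v in verts):
                    return k

    # 5-cycle as an adjacency-set dictionary
    c5 = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {0, 3}}
    ```

    On the 5-cycle, {0, 2} dominates every vertex, but no two vertices totally dominate it, so the total domination number is strictly larger; a real collaboration graph would need heuristics rather than this exponential search.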

  • A Fully Tuned RACS For Flight Control System   Order a copy of this article
    by Chandra Sekhar Mohanty, Patha Sarathi Khuntia, Debjani Mitra 
    Abstract: This paper presents the design of a roll attitude control system (RACS) based on a Bacterial Foraging Particle Swarm Optimized (BFPSO) Proportional-Integral-Derivative (PID) autopilot that controls the roll angle of an aircraft. The most important function of any aircraft flight control system (AFCS) operating on lateral motion is to attain a high degree of spiral stability, which requires the use of a RACS. The RACS uses a feedback mechanism to maintain the roll attitude in the presence of disturbances and responds rapidly and accurately to roll commands from the pilot, but the conventional RACS has very poor performance. Bacterial foraging optimization (BFO) and particle swarm optimization (PSO) are topics of recent research; both are population-based techniques inspired by the social behavior of biological organisms. In this paper a new algorithm, BFPSO, is proposed which combines the advantages of both PSO and BFO. The parameters Kp, Ki and Kd of the PID-based RACS controller are tuned by the BFPSO algorithm to achieve the dynamic stability required by the pilot while spinning an aircraft. For comparison, RACS designs based on a PSO PID controller, a BFO PID controller and a roll rate inner loop damper are also presented. It is observed that the proposed autopilot system quickly and accurately responds to roll commands from the pilot while maneuvering an aircraft. The aircraft considered here is a four-engine jet aircraft known as the Charlie aircraft.
    Keywords: Roll Attitude Control System; PID controller; Bacterial Foraging Optimization; Particle Swarm Optimization.

  • Design of Image Enhancement Filters Using a Novel Parallel Particle Swarm Optimization Algorithm   Order a copy of this article
    by Geraldine Amali, Siddhartha Bhuyan, Aju D 
    Abstract: Designing image enhancement filters with an arbitrary frequency response subject to stability constraints is a complex multidimensional optimization problem. In this paper a novel Parallel Particle Swarm Optimization (PPSO) algorithm is proposed and applied to the design of Infinite Impulse Response image filters. The proposed PPSO is divided into two phases. In the first phase the particle swarm of the classical PSO algorithm is divided into subpopulations that evolve on separate cores of a multi-core machine; the best solutions from each subpopulation are then interchanged between cores. In the second phase a local search using the Nelder-Mead simplex is performed to refine the solution. Classical PSO is used for global exploration of multiple local minima, whereas Nelder-Mead refines the solution computed by the PSO. The proposed PPSO outperformed the other global optimization algorithms in terms of the mean square error between the ideal and designed filter frequency responses and CPU utilization.
    Keywords: Particle Swarm Optimization; Parallel Particle Swarm Optimization; Infinite Impulse Response filter design; Genetic Algorithm; image enhancement; Nonlinear Global Optimization; Nelder-Mead simplex search.

  • Cross region load balancing of tasks using region based rerouting of loads in cloud computing environment   Order a copy of this article
    by Kaushik Sekaran, Venkata Krishna P 
    Abstract: Cloud computing has rapidly conquered the world of the internet with its enormous computing power and its services to cloud clients. Cloud computing has many unique features and many useful services. One forward-looking technique is load balancing of tasks, which helps cloud servers achieve minimal time delay in delivering services. Cross-region load balancing of tasks is at the heart of real-time cloud computing, as it addresses some of the larger geographical internet service issues between countries around the world. In this paper we propose a novel algorithm to minimize the loads across various cloud servers through proper analysis of region-based load balancing of tasks. The results presented in this paper offer sound solutions for upcoming cloud computing issues. Our approach demonstrates the accomplishment of ideal load balancing of tasks in several hybrid regions with respect to minimal latency and high throughput.
    Keywords: Cloud computing; load balancing; cloud servers; cross region; geographical internet services; minimal latency; throughput.

  • Sound Model for Dialogue Profiling   Order a copy of this article
    by Daniela López De Luise, Ruben Azor 
    Abstract: Since Zipfian behavior was defined as a way to predict word distributions in written texts for any language, there has been an open door for practical applications in understanding how the brain produces sentences and dialogues in a very efficient and robust way. Currently many applications profit from advances in understanding linguistics and psycholinguistics, but these advances are not balanced by speech and utterance modeling. This paper presents a fractal model for oral productions in Spanish. Results analysis shows that sound tracks from a sound library with Spanish voices can be assimilated to a specific fractal distribution; regardless of the speaker, the trend holds for every sample.
    Keywords: natural language processing; dialogue processing; fractals; zipfian behaviour; language metrics; dragon fractals; linguistic reasoning; reasoning modelling; language production; speech processing.

  • Comparing Product Features of Motor Cycles - A Multi-group Analysis   Order a copy of this article
    by Alagirisamy Kamatchi Subbiah Sukumaran 
    Abstract: The study compares the product features of motor cycles based on the preferences of consumers. Inman (2001) noted the need to create an inventory of features which can be used to elicit the choice of consumers. The literature dealing with satisfaction from product features is mostly based on social and demographic characteristics; there are not many studies which analyze the product features themselves based on the choice of the consumers. This study is unique in attempting multi-group analysis, in addition to regression, for the purpose of identifying the unique product features of two leading motor cycle brands. The results of the study will be useful to two-wheeler manufacturers and marketers in identifying the distinctive product features of motor cycles, formulating marketing strategy, and developing new models of motor cycles.
    Keywords: motor cycles; product features; multi-group analysis.

  • Hybrid Framework using Data Mining Techniques for Early Detection and Prevention of Oral Cancer   Order a copy of this article
    by Neha Sharma, Hari Om 
    Abstract: This paper presents the usage of classification and association data mining techniques for early detection and prevention of oral cancer. An indigenous dataset of 1025 patients who visited a tertiary care centre from 2004 to 2009 was used for the research. Ten classification data mining models are designed using varied types of data mining techniques, namely regression analysis, classification trees and neural networks. The regression analysis models are a linear regression model and a logistic regression model; the classification tree models are a decision tree model, a decision tree forest model and a TreeBoost model; and the artificial neural networks are a Multilayer Perceptron model, a Radial Basis Function model, a Group Method of Data Handling model, a Cascade Correlation model and a Probabilistic General Regression Neural Network model. Association rules are generated using the apriori algorithm. The classification models and association rules are evaluated using estimation parameters. Finally, a hybrid oral cancer management system is proposed using the best-performing classification model and the association rules that contribute towards the construction of the knowledge base.
    Keywords: Oral Cancer; Regression analysis; linear regression; logistic regression; decision tree; decision tree forest; TreeBoost; artificial neural networks; Multilayer Perceptron; Radial Basis Function; Group Method of Data Handling; Cascade Correlation; Probabilistic General Regression Neural Network; Apriori; Association Rule Mining.
    DOI: 10.1504/IJAIP.2019.10004660
  • Natural Language Processing for Hybrid Knowledge Representation   Order a copy of this article
    by Poonam Tanwar, T.V. Prasad, Kamlesh Dutta 
    Abstract: The excessive amount of knowledge has increased the demand for tools for organizing, processing and extracting knowledge. Organizing different types of knowledge and inferring from it intelligently is the biggest task in knowledge engineering (KE), natural language processing (NLP), information retrieval (IR) and knowledge management. The key factor for the development of a society or nation in the current era is interactive access to information. The sources of information available today benefit people who are familiar with the English language, but the biggest question is what about those, not only in India but abroad too, who are not familiar with English. One solution to this problem is graphical visualization with easy and fast access to the information available through any source. This paper presents a system that provides a user-friendly interface for all users (very good English, good English, not too good in English) for knowledge gathering, discovery and retrieval.
    Keywords: Knowledge Representation (KR); Natural Language Processing (NLP); Semantic net; Script; Classification.

  • Marker-Based Augmented Reality Interface with Gesture Interaction to Access Remote File System   Order a copy of this article
    by Shriram K Vasudevan, Naveen T, Padminy KV, Shruthi Krithika J, Geethan P 
    Abstract: Augmented Reality (AR) is a technology which enriches the real world with digital information. An AR interface superimposes digital objects, or interactive computer graphics, onto the real world dynamically. Since its introduction, the technology has been capable of presenting possibilities that have been challenging for other technologies to offer and meet. Nevertheless, AR environments have been largely limited to simple browsing or viewing of virtual information registered to the real world. In a few more years, AR will definitely change the way individuals view the world. In this paper, we design a system where remote files and directories are augmented in real time over the camera view of a smartphone, tablet or PC. Users can access the remote file system and perform operations using gestures. This system provides a smooth and continuous interaction between the user and the digital space using only hand gestures, without the use of any special-purpose devices like a mouse or a joystick.
    Keywords: Marker-Based Augmented Reality; AR Interface; Leap Motion Controller; Gesture Interaction; Remote System Access.

  • An Extension of the Ontology Web Language with Multi-Viewpoints and Probabilistic Reasoning   Order a copy of this article
    by Mounir Hemam 
    Abstract: A real-world entity is unique, but it can have several representations due to various interests or perspectives. In this paper, we are interested in the problem of multi-representation in ontologies. We believe that the most appropriate approach is to use the notion of viewpoint in order to build what we call a multi-viewpoints ontology. This type of ontology confers on the same universe of discourse several partial descriptions, each relative to a particular viewpoint. Moreover, these partial descriptions share, at a global level, probabilistic ontological elements allowing the representation of uncertain knowledge between the various viewpoints. The treatment of this kind of information requires new approaches for knowledge representation and reasoning on the web, as existing Semantic Web languages are based on classical logic, which is known to be inadequate for representing uncertainty. Our goal is therefore to propose an ontology web language which extends OWL with viewpoints and probabilistic uncertainty, to allow multi-viewpoints and probabilistic reasoning with OWL ontologies.
    Keywords: knowledge engineering; ontology; semantic web; multiple viewpoints; probabilistic reasoning.
    DOI: 10.1504/IJAIP.2018.10003857
  • Dynamic vs Static agent ordering in Distributed Arc Consistency   Order a copy of this article
    by Saida Hammoujan, Imade Benelallam, El Houssine Bouyakhf 
    Abstract: Recently, many approaches have been proposed for solving Distributed Constraint Satisfaction Problems (DisCSPs). Among these, Asynchronous Maintenance of Arc Consistency (AMAC) has proven to be an efficient algorithm; it performs an asynchronous arc consistency process during sequential search. In this paper, we propose two new approaches based on AMAC. Instead of using a lexicographic ordering as a static agent/variable ordering, we present two asynchronous algorithms that exploit the structure of DisCSPs through powerful agent/variable ordering heuristics and enforce arc consistency during resolution. The first algorithm, AMAC_DO, uses dynamic variable ordering heuristics, which are very useful in centralized CSPs. The second algorithm, ILAAC, is based on splitting the problem into several sub-problems using the pseudo-tree structure of the constraint graph. We offer an analysis and interpretation of an experimental evaluation of the proposed approaches. The experimental results clearly show the usefulness of the arc consistency process combined with variable ordering heuristics for random problems, in terms of communication cost and computation effort.
    Keywords: Distributed Constraint Satisfaction Problems; Arc Consistency; Variable Ordering Heuristics; Pseudo-tree.

  • An Effective e-Learning system through learners' scaffolding   Order a copy of this article
    by Suman Bhattacharya, Sankhayan Chowdhury, Samir Roy 
    Abstract: Scaffolding is an age-old technique of teacher intervention to augment and quicken the learner's learning process. While a human teacher has the scope to interact with the learner in contact mode and apply his intelligence to assess the need for such effort, an e-learning system does not have this capacity. This paper presents a scaffolding system for an e-learner, targeted towards school-level children. The system is structured around the concept of a finite state machine to model the cognitive state of the learner. Learning experiences are also taken into consideration. The system was tested on a large number of school-going children. Experimental results indicate that under this system, students achieve their learning objectives to a greater extent, with better experience.
    Keywords: Intelligent Tutoring Systems; Interactive Learning Environments; Pedagogical issues; Teaching/Learning strategies; scaffolding; e-learning; finite state machine; learning experience.

  • An empirical study of feature selection for classification using genetic algorithm   Order a copy of this article
    by Saptarsi Goswami, Amlan Chakrabarti, Basabi Chakraborty 
    Abstract: Feature selection is one of the most important preprocessing steps for a data mining, pattern recognition or machine learning problem. Finding an optimal subset of features among all the possible feature subsets is an NP-complete problem, and the use of evolutionary algorithms is one approach to tackling such problems. The genetic algorithm (GA) is one of the variants of evolutionary processes based on selection, mutation and reproduction, with selection based on the survival-of-the-fittest principle. An optimal feature subset should have the highest association with the target variable and low inter-feature association. In the literature, most approaches combine these objectives in a single numeric measure. In this paper, in contrast, the problem of finding an optimal feature subset is formulated as a multi-objective problem. The concept of redundancy is refined with a threshold value, and an objective to maximize the entropy of individual attributes is added in one of the multi-objective experimental setups. Experiments on thirty-three publicly available datasets have been conducted with three multi-objective and two single-objective settings. Analysis of the results reveals better classification accuracy for the multi-objective methods compared to the single-objective methods: a 12% improvement in classification accuracy can be observed on average. It is shown to improve further, by 2-3%, after refining the concept of redundancy (mIRMR) using probabilistic thresholding, and then by adding entropy maximization (mIRMRE) as an objective. The performance improvement is statistically significant, as found by a pairwise t-test and Friedman's test.
    Keywords: Feature Selection; Classification; Genetic Algorithm (GA); Multi-objective; Filter.
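
    The GA formulation in this abstract encodes a feature subset as a bit mask evolved under selection, crossover and mutation. The sketch below shows that encoding with a deliberately toy fitness (the multi-objective relevance/redundancy measures of the paper are collapsed to one invented scalar); the function names and parameters are illustrative, not the authors'.

    ```python
    import random

    def ga_feature_select(n_features, fitness, pop=30, gens=40,
                          cx=0.8, mut=0.02, seed=7):
        """Bit-mask GA: tournament selection, one-point crossover, bit-flip mutation."""
        rng = random.Random(seed)
        population = [[rng.randint(0, 1) for _ in range(n_features)]
                      for _ in range(pop)]

        def tournament():
            a, b = rng.sample(population, 2)
            return a if fitness(a) >= fitness(b) else b

        for _ in range(gens):
            nxt = []
            while len(nxt) < pop:
                p1, p2 = tournament()[:], tournament()[:]
                if rng.random() < cx:                      # one-point crossover
                    pt = rng.randrange(1, n_features)
                    p1, p2 = p1[:pt] + p2[pt:], p2[:pt] + p1[pt:]
                for child in (p1, p2):                     # bit-flip mutation
                    nxt.append([1 - g if rng.random() < mut else g
                                for g in child])
            population = nxt[:pop]
        return max(population, key=fitness)

    # Toy single-scalar stand-in for the paper's objectives: reward
    # "informative" features (pretend features 0-2 are relevant) and
    # penalize subset size, a crude relevance-minus-redundancy trade-off.
    def toy_fitness(mask):
        relevance = sum(mask[:3])
        return relevance - 0.1 * sum(mask)

    best = ga_feature_select(10, toy_fitness)
    ```

    A true multi-objective variant would replace the scalar `fitness` with Pareto ranking over the separate relevance, redundancy and entropy objectives.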

  • Verification on Factors of Information Technology Acceptance for Construction Users based on the Davis's Technology Acceptance Model : Focused on the Application Case of IT in Construction   Order a copy of this article
    by Eun Soo Park, Tai Sik Lee, Min Seo Park 
    Abstract: Since the information wave of the 1990s, various changes, including improved business processes and management styles, have emerged as different fields attempt to create more profit through increased productivity. It is clear that the use of IT in the construction industry will spread with the changing paradigm, and the extent of IT application will increase rapidly as well. Thus, a study showing the extent of users' acceptance of IT within the context of the construction industry needs to be conducted. Based on this perception, we conducted research on IT acceptance by individuals in the construction industry based on Davis's (1989) technology acceptance model (TAM). To introduce Davis's model into the construction IT model, we hypothesized whether each internal and external construction IT factor would influence information accepters. Through the survey, based on the construction IT model and the statistical analysis of the survey, we observed the accepters' perceived levels of usefulness and ease of use before the introduction of new IT. Finally, we aim to grasp how the IT being utilized in the construction industry impacts users, and to determine whether the latest IT should be introduced by analyzing whether users are able to accept it.
    Keywords: Information Technology Acceptance; Construction Users; Verification; Davis; Technology Acceptance Model.

  • Secure Minimum Loss Route Selection of MIMO based MANET in combined (Indoor, Outdoor, and Forest) Terrain   Order a copy of this article
    by Swati Chowdhuri, Pranab Banerjee, Sheli Sinha Chaudhury, Nilanjan Dey, Arun Mandal, V. Santhi 
    Abstract: Multiple-input multiple-output (MIMO) is a very promising technique in modern wireless communication systems, which can meet the demand for high data rates with limited bandwidth. Integration of MIMO technology with a mobile ad hoc network can improve the performance of the transmission process in hazardous environments. In order to design a real MIMO wireless system and predict its performance under certain circumstances, it is necessary to have accurate MIMO wireless channel models for different scenarios. In general, a mobile ad hoc network with multiple antennas suffers from scattering effects. In this work, a combination of a two-ring model and a random scattering model is discussed to evaluate the channel impulse response of the network. The channel impulse response, or channel matrix, is used to estimate the propagation loss of the MIMO-based mobile ad hoc network in different terrains. Finally, minimum-loss secure path selection is carried out by the proposed PASR (Path loss based Administrator selection Secure Routing) protocol. The efficiency of the proposed protocol is verified through the obtained results.
    Keywords: Mobile ad hoc network (MANET); Multi Input Multi Output (MIMO); Impulse Response; Propagation loss; Routing.

  • Automated Lumbar-lordosis Angle Computation from Digital X-ray Image Based on Unsupervised Learning   Order a copy of this article
    by Raka Kundu, Amlan Chakrabarti, Prasanna Lenka 
    Abstract: Computation of the lumbar-lordosis angle (LLA) of the spine is a common measure for patients suffering from lower back pain (LBP) and one of the measures for proper monitoring of patients with spine problems. The LLA is the angle formed between the extreme superior lumbar vertebra (L1) and the superior sacrum vertebra (S1). Based on the Gaussian Mixture Model (GMM), an unsupervised method, an automated image processing technique was developed for computation of the LLA from spine sagittal X-ray images, where the lumbar-sacral curvature is identified and the curvature angle (Cobb's method) is measured to obtain the LLA. Determination of the LLA is a major parameter in finding the credible etiology of LBP syndromes. The objective of our proposed automated technique is to ease real-life issues in medical treatment, act as a primitive investigation in patients with suspected LBP syndromes, and assess the severity of the disease. To the extent of our knowledge, the proposed technique for automated LLA computation from digital X-rays is the first of its kind. The technique was validated on 22 X-ray images, and promising results were achieved in the performed experiments.
    Keywords: Automated computer-aided detection and diagnosis; lumbar-lordosis Cobb’s angle (LLA); Digital X-ray image; Gaussian mixture model; expectation maximization; lumbar-lordosis (LL).
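    Once the L1 and S1 endplate lines are located, the Cobb-style angle reduces to the angle between two lines. A minimal sketch with hypothetical landmark coordinates (the paper's GMM-based curvature detection is not reproduced here):

```python
import math

def cobb_angle(p1, p2, q1, q2):
    """Cobb-style angle (degrees) between two endplate lines,
    each defined by two (x, y) landmark points."""
    a1 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    a2 = math.atan2(q2[1] - q1[1], q2[0] - q1[0])
    angle = abs(math.degrees(a1 - a2)) % 180.0
    return min(angle, 180.0 - angle)   # report the acute intersection angle

# A horizontal L1 endplate against a 45-degree S1 endplate (toy points):
print(cobb_angle((0, 0), (10, 0), (0, 0), (10, 10)))  # ~45 degrees
```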

  • A Simulation of Model Selection Strategy in Hierarchical System Using the Analytic Hierarchy Process   Order a copy of this article
    by Gabsi Mounir, Rekik Ali 
    Abstract: Existing literature has recognized that superior organizational capabilities, stemming primarily from knowledge integration, bring firms strong strategic outcomes. This article argues that organizational capability can be used as a bridge to explain the relationship between strategy and knowledge. Many kinds of management information systems (MISs) exist in both industries and enterprises, and a model selection strategy for the flow of management resources can help a firm achieve its strategic goals and, further, align its knowledge management with its strategies. We develop a synthesis of coordination management strategies for resource flows to aggregate distributed hierarchical system topics. In this paper we propose a simulation of a model selection strategy for a work-travel company using the analytic hierarchy process.
    Keywords: Strategy; Aggregation of hierarchy; synthesis strategies; Coordination strategies; AHP; aggregation; Decision making; Goals.
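    The AHP step this simulation relies on, deriving a priority vector from a pairwise-comparison matrix via its principal eigenvector, can be sketched as follows (the comparison values are illustrative, not the paper's case data):

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority vector of an AHP pairwise-comparison matrix,
    computed as the normalized principal eigenvector (power iteration)."""
    A = np.asarray(pairwise, dtype=float)
    w = np.ones(A.shape[0])
    for _ in range(100):
        w = A @ w
        w /= w.sum()          # keep the weights summing to 1
    return w

# Option 1 is 3x as preferred as option 2, which is 3x option 3 (consistent).
A = [[1, 3, 9],
     [1/3, 1, 3],
     [1/9, 1/3, 1]]
print(ahp_priorities(A).round(3))  # [0.692 0.231 0.077]
```

For inconsistent matrices one would additionally check the consistency ratio before trusting the weights.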

  • A New Fractal Watermarking Method for Images of Text   Order a copy of this article
    by Kourosh Kiani, Arash Mousavi, Shahaboddin Shamshirband 
    Abstract: A new method using orthogonal fractal coding is developed for fractal watermarking of high-contrast, low-density images of text. In this method, the image is divided into sub-images of one-pixel height, and each sub-image is coded separately using the orthogonal fractal coding technique. A binary watermark is re-ordered using a chaotic sequence and inserted into the range-block means of the fractal codes. The fractal code is then decoded to obtain the watermarked image. The watermark sequence is retrieved by comparing the fractal codes of the original and watermarked images, and the extracted watermark is re-ordered using the key of the chaotic sequence. The method is robust against JPEG and noise attacks and has very low watermark visibility.
    Keywords: fractal watermarking; high contrast image; text watermarking; steganography.

  • A Clustering Based Recommendation Engine for Restaurants   Order a copy of this article
    by Aarti Singh, Anu Sharma 
    Abstract: With the widespread growth of the tourism industry, restaurant recommendation has become an important application area for Recommendation Systems (RS). Designing an efficient and scalable solution for restaurant recommendation is still an open area of research. Many researchers have contributed to the idea of building recommendation systems for restaurants, but none of these approaches clusters the user-profile database to reduce the search space before applying Recommendation Techniques (RT). The aim of this research is to provide a more scalable solution for recommending restaurants. This work applies existing RT to reduced rating data obtained by clustering user profiles. Results suggest a considerable decrease in processing time while maintaining the accuracy of the recommendations.
    Keywords: clustering; k-means; recommendation techniques; user profiling; restaurant recommendation.
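    The core idea, cluster user profiles first and then recommend only within the matching cluster, can be sketched with a toy rating matrix. The features, cluster count, and scoring rule below are simplified assumptions, not the paper's exact pipeline:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means; returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - C) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                C[j] = X[labels == j].mean(axis=0)
    return C, labels

# Toy user-restaurant rating matrix (0 = unrated); two taste groups.
R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4],
              [0, 1, 4, 5]], dtype=float)
C, labels = kmeans(R, k=2)

def recommend(user, R, labels):
    """Recommend the unrated restaurant best rated inside the user's cluster."""
    peers = R[labels == labels[user]]
    scores = peers.mean(axis=0)
    scores[R[user] > 0] = -1            # mask restaurants already rated
    return int(np.argmax(scores))

print(recommend(0, R, labels))  # user 0's cluster points to restaurant 2
```

Searching only the user's cluster instead of the full profile database is what yields the processing-time reduction the abstract reports.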

  • A hybrid technique for de-noising multi-modality medical images by employing Cuckoos Search with Curvelet transform   Order a copy of this article
    by Muhammad Arif, Manzoor Elahi 
    Abstract: Medical image de-noising is a difficult task. An efficient de-noising and contrast-enhancement technique is an important step in improving the overall visual quality of clinical images and producing good, reliable diagnosis results. Numerous image de-noising and contrast-enhancement approaches have been developed to solve this problem, but some fail to provide good accuracy and efficiency. In this study, we de-noise medical images without loss of information, using curvelet and ridgelet transforms with cuckoo search (CS). The curvelet transform, a multi-scale directional transform, can adapt to and sparsely represent pixel information along edges. This benefits image de-noising techniques whose application relies on the local source information of the image. Edges, which play a vital role in the understanding of images, are better represented by curvelet transforms than by wavelets. We optimize the de-noised and enhanced coefficients using the evolutionary CS algorithm without loss of structural and morphological information; CS keeps each coefficient at the same position in the image after de-noising. Our research focuses on devising a reliable method that performs accurate and efficient de-noising of both additive and multiplicative noise. The sequence of coefficients generated by the curvelet and ridgelet transforms is determined using CS, an optimization algorithm. Results indicate that our proposed approach outperforms other approaches in removing impulse, Gaussian, and speckle noise.
    Keywords: Medical images; Curvelet transform; Cuckoo search; Noise; Optimization; de-noising.
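    Cuckoo search itself is well documented. A minimal Lévy-flight version is sketched below, minimizing a toy quadratic as a stand-in for the paper's curvelet-coefficient objective (population size, step scaling, and bounds are illustrative assumptions):

```python
import math
import numpy as np

def cuckoo_search(f, dim, n=15, iters=300, pa=0.25, seed=0):
    """Minimal cuckoo search: Levy-flight moves plus abandonment of poor nests."""
    rng = np.random.default_rng(seed)
    beta = 1.5
    # Mantegna's algorithm constant for Levy-stable step lengths
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    nests = rng.uniform(-5, 5, (n, dim))
    fit = np.array([f(x) for x in nests])
    for _ in range(iters):
        best = nests[np.argmin(fit)]
        # Levy flight biased toward the current best nest
        step = rng.normal(0, sigma, (n, dim)) / np.abs(rng.normal(0, 1, (n, dim))) ** (1 / beta)
        trial = nests + 0.01 * step * (nests - best)
        tfit = np.array([f(x) for x in trial])
        better = tfit < fit
        nests[better], fit[better] = trial[better], tfit[better]
        # A fraction pa of nests is abandoned and rebuilt at random
        drop = rng.random(n) < pa
        drop[np.argmin(fit)] = False          # always keep the best nest
        nests[drop] = rng.uniform(-5, 5, (int(drop.sum()), dim))
        fit[drop] = np.array([f(x) for x in nests[drop]])
    return nests[np.argmin(fit)]

best = cuckoo_search(lambda x: float(np.sum(x ** 2)), dim=2)
print(best)  # should land close to the optimum at [0, 0]
```

In the paper's setting the objective would score a candidate set of transform coefficients by the quality of the reconstructed image rather than this toy function.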

  • Word Sense Based Approach for Hindi to Tamil Machine Translation Using English as Pivot Language   Order a copy of this article
    by Vimal Kumar K, Divakar Yadav 
    Abstract: Machine translation is the translation of a source text into a desired target text. With globalization in every field, there is a great need for machine translation systems in the current internet world. Since resources are available in many languages on the internet, knowledge must be shared with audiences who know only their native language. The proposed system builds a word-sense-based statistical machine translation system for translating Hindi into Tamil. Since these languages lack resources, an intermediate pivot language with high resource availability is needed; English is chosen as the pivot language due to its rich resources. Initially, the Hindi text undergoes a preprocessing phase in which it is morphologically and syntactically analyzed. Based on this analysis, the senses of the words are identified using Latent Semantic Analysis (LSA) in order to provide a meaningful translation. Once these analyses are done, the sentence is statistically translated from the source to the pivot language and then from the pivot to the target language. The system shows improved efficiency compared with a system that has neither sense identification nor a pivot language.
    Keywords: Statistical Machine Translation; Word Sense Disambiguation; Latent Semantic Analysis; Pivot based Machine Translation.
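    The LSA step rests on a truncated SVD of a term-context matrix: contexts that share vocabulary end up close in the latent "sense" space. A toy sketch (the matrix and dimensionality are illustrative, not the paper's data):

```python
import numpy as np

# Toy term-context co-occurrence matrix (rows: terms, columns: contexts).
M = np.array([[2, 0, 1],
              [1, 0, 0],
              [0, 3, 1],
              [0, 2, 0]], float)

U, s, Vt = np.linalg.svd(M, full_matrices=False)
k = 2                                    # keep the top-k latent dimensions
docs_k = (np.diag(s[:k]) @ Vt[:k]).T     # contexts embedded in the latent space

def cos(a, b):
    """Cosine similarity of two latent-space vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Contexts 0 and 2 share vocabulary; contexts 0 and 1 do not:
print(cos(docs_k[0], docs_k[2]) > cos(docs_k[0], docs_k[1]))  # True
```

In the translation pipeline, an ambiguous word would be assigned the sense whose latent context vector is most similar to the vector of the sentence at hand.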

  • Quality Factor Optimization of Spiral Inductor using Firefly Algorithm and its Application in Amplifier   Order a copy of this article
    by Ram Kumar, Fazal.A Talukdar, Nilanjan Dey, Valentina E. Balas 
    Abstract: This paper details an optimized design of a CMOS spiral inductor for the output matching circuit of a Low Noise Amplifier (LNA), employing a nature-inspired technique called the Firefly Optimization Algorithm (FA). The inductor parameters are optimized using a single objective function, and the penalty-factor method is used for handling the constraints. Using the FA technique, an inductor with a quality factor of 5.87 at 5.5 GHz is obtained in the Matlab environment; the computer-aided design tool ASITIC is used for validation. The output matching circuit of the LNA is designed using the Pi model obtained from ASITIC. The designed LNA has a cascode structure with an inductive source-degeneration topology, is implemented in UMC 0.18 μm CMOS technology using CADENCE software, and is simulated at 5.5 GHz.
    Keywords: Spiral Inductor; Optimization Technique; Particle swarm optimization; Firefly optimization; Low noise amplifier; Quality factor; ASITIC.
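    The firefly update rule, dimmer fireflies move toward brighter ones with distance-decayed attraction plus a decaying random term, can be sketched on a toy objective. The paper's penalized Q-factor model is not public here, so a quadratic stands in for it, and all parameter values are illustrative:

```python
import numpy as np

def firefly(f, dim, n=12, iters=150, alpha=0.3, beta0=1.0, gamma=0.05, seed=1):
    """Minimal firefly algorithm: lower objective value = brighter fly."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (n, dim))
    I = np.array([f(x) for x in X])
    for t in range(iters):
        for i in range(n):
            for j in range(n):
                if I[j] < I[i]:          # j is brighter, so i moves toward j
                    r2 = float(np.sum((X[i] - X[j]) ** 2))
                    X[i] = (X[i] + beta0 * np.exp(-gamma * r2) * (X[j] - X[i])
                            + alpha * 0.97 ** t * rng.uniform(-0.5, 0.5, dim))
                    I[i] = f(X[i])
    return X[np.argmin(I)]

# Stand-in objective; a real run would score inductor geometry by -Q plus penalties.
best = firefly(lambda x: float(np.sum(x ** 2)), dim=2)
print(best)  # should settle near the optimum at the origin
```

Constraint handling in the paper is via penalty terms added to this objective rather than a separate mechanism.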

  • Ant_VRP: Ant-Colony based Meta-heuristic Algorithm to Solve the Vehicle Routing Problem   Order a copy of this article
    by Majid Nikougoftar Nategh, Ali Asghar Rahmani Hosseinabadi, Valentina Emilia Balas 
    Abstract: The vehicle routing problem is one of the most important combinatorial optimization problems for researchers and scientists today. In this kind of problem, the aim is to determine the minimum cost needed to move the vehicles, which start simultaneously from the warehouse and return to it after visiting customers. There are two constraints on customers and vehicles: first, each node must be visited by exactly one vehicle; second, no vehicle may load more than its capacity. In this paper, a combination of the Ant Colony Algorithm and a mutation operation, named Ant_VRP, is proposed to solve the vehicle routing problem. The performance of the algorithm is demonstrated by comparison with other heuristic and meta-heuristic approaches.
    Keywords: Vehicle Routing Problem; Optimization; Ant Colony Algorithm; Mutation.
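    The two ingredients named in the abstract are the ant's probabilistic next-customer choice and a mutation operator on constructed routes. Both can be sketched briefly (the pheromone and distance values are illustrative, and this omits capacity checks):

```python
import random

def next_customer(current, candidates, tau, dist, alpha=1.0, beta=2.0):
    """Roulette-wheel ant step: P(j) proportional to tau^alpha * (1/distance)^beta."""
    weights = [tau[current][j] ** alpha * (1.0 / dist[current][j]) ** beta
               for j in candidates]
    r = random.uniform(0, sum(weights))
    acc = 0.0
    for j, w in zip(candidates, weights):
        acc += w
        if acc >= r:
            return j
    return candidates[-1]

def swap_mutation(route):
    """Mutation operator: exchange two customers in a constructed route."""
    i, j = random.sample(range(len(route)), 2)
    route = route[:]
    route[i], route[j] = route[j], route[i]
    return route

random.seed(42)
tau = [[1.0] * 4 for _ in range(4)]                         # uniform pheromone
dist = [[1, 2, 9, 9], [2, 1, 9, 9], [9, 9, 1, 2], [9, 9, 2, 1]]
print(next_customer(0, [1, 2, 3], tau, dist))  # customer 1 is by far the likeliest
print(swap_mutation([0, 1, 2, 3]))
```

A full Ant_VRP-style solver would additionally evaporate and deposit pheromone after each colony iteration and enforce the capacity constraint when building candidate lists.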

  • Context Aware Power Management in Smart Grids Using Load Balancing Approach   Order a copy of this article
    by RajaSekhar Reddy NV, Venkata Krishna P 
    Abstract: A smart grid is an electrical grid with an advanced digital communication network. In the past decade, developments in the field of smart grids have attracted many researchers. This paper presents a power-aware model for context awareness and load management in smart homes. In the adaptive system of the smart grid, smart homes are treated as smart nodes. The proposed power-aware smart home management model (PASH) incorporates an evolutionary programming algorithm and context-awareness rules for communicating with users. The PASH model demonstrates the advantage of load balancing in smart homes for both users and the smart grid. The context management module in PASH helps utilize power efficiently during peak demand.
    Keywords: Power management; load balancing; context; smart homes; smart grids; evolutionary programming.

  • Methodology of Wavelet Analysis in Research of Dynamics of Phishing Attacks   Order a copy of this article
    by Mehdi Dadkhah, Vyacheslav V. Lyashenko, Zhanna V. Deineko, Shahaboddin Shamshirband, Mohammad Davarpanah Jazi 
    Abstract: The safe transfer and reception of data over the Internet can be undermined by the presence of harmful components in the transmitted content, and the phishing attack is one such component. It is therefore important to know the relationship between Phishes Verified as Valid and Suspected Phishes Submitted, which is necessary for forecasting. To solve this problem, we apply wavelet analysis to the time series representing Phishes Verified as Valid and Suspected Phishes Submitted. We consider the change of the Hurst indicator and analyze the spectrum of wavelet energy, which allows us to identify the main characteristics of the time series under consideration. The conducted research shows the presence of substantial long-term dependence in the investigated data, and we also identify a trend component in the structure of the investigated series. This makes it possible to investigate the recurrence of phishing attacks, allowing resources to be concentrated during periods when such harmful influences intensify. The analysis is performed on real data, which underlines the importance of the conclusions obtained.
    Keywords: Internet; phishing; trend; wavelet analysis; wavelet energy; wavelet expansion; Hurst indicator; Daubechies Wavelet.
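    The Hurst indicator the abstract relies on distinguishes long-term-dependent series (H > 0.5) from uncorrelated ones (H ≈ 0.5). A classical rescaled-range (R/S) estimate, shown here on synthetic white noise rather than the paper's phishing data, illustrates the quantity being tracked:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Rescaled-range (R/S) estimate of the Hurst exponent of a 1-D series."""
    x = np.asarray(x, float)
    N = len(x)
    sizes, rs = [], []
    n = min_chunk
    while n <= N // 2:
        vals = []
        for i in range(0, N - n + 1, n):           # non-overlapping chunks
            c = x[i:i + n]
            dev = np.cumsum(c - c.mean())          # cumulative deviation
            s = c.std()
            if s > 0:
                vals.append((dev.max() - dev.min()) / s)
        sizes.append(n)
        rs.append(np.mean(vals))
        n *= 2
    # H is the slope of log(R/S) against log(chunk size)
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return float(slope)

rng = np.random.default_rng(0)
print(hurst_rs(rng.normal(size=4096)))  # close to 0.5 for uncorrelated noise
```

The wavelet-based estimator used in the paper targets the same exponent via the scaling of wavelet energy across levels; R/S is used here only because it is compact.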

  • Real Time Navigation of a Mobile Robot   Order a copy of this article
    by P. Raja, Ambati Akshay, Akshay Kumar Budumuru 
    Abstract: Path planning is one of the major challenges in robot navigation. When a robot is unable to decide the direction in which to move with the information available, the decision can be made with the help of an external source. Most algorithms involve computations that may require a lot of memory and time. Our work avoids those calculations by sending a direct visual feed to a control room where the environment can be analyzed, enabling the robot to move from the source to the goal position. The robot also displays its coordinates on the console screen, which can be used for creating maps. During our work we observed that decreasing the time interval for refreshing the coordinates also decreases the closest distance the robot can approach an obstacle, thus allowing the robot to move through very narrow paths into which it can fit.
    Keywords: Mobile robot; navigation; mapping; control room.

  • Energy and Velocity based Multipath Routing Protocol for VANET   Order a copy of this article
    by Bhagyavathi Miriam, Saritha Vankadara 
    Abstract: A VANET is a type of network that can be built randomly, quickly, and temporarily without any standard infrastructure. In a VANET, routing data is an interesting and challenging task because of the high mobility. The routing algorithm for VANETs is therefore an imperative issue, particularly in vehicle-to-vehicle communication. This paper proposes a multipath routing algorithm for VANETs named the Energy and Velocity based Multipath Routing Protocol (EVMRP), based on available bandwidth, residual energy, and relative velocity. The key point of the proposed algorithm is setting CWmax according to the available bandwidth of the path. The proposed algorithm is evaluated on QoS parameters such as end-to-end delay, throughput, and packet loss. The results clearly indicate that EVMRP outperforms legacy systems such as AOMDV.
    Keywords: Routing; VANET; available bandwidth; Multipath.

  • Mutation Based Genetic Algorithm for Efficiency Optimization of Unit Testing   Order a copy of this article
    by Rijwan Khan, Mohd. Amjad 
    Abstract: Faults in a software program can be detected by mutation testing. However, mutation testing is an expensive process in the software testing domain. In this paper, we introduce a method based on a Genetic Algorithm and Mutation Analysis for the unit testing process. The software industry produces high-quality software in which software testing plays an important role. First, we take a program, insert some mutants into it, find the most critical path, and optimize test cases using a genetic algorithm for unit testing. The initially generated test cases are refined using the genetic algorithm. We use a mutant function for measuring the adequacy of the test case set; this mutant function is used to calculate a mutation score. We have achieved 100% path coverage and boundary coverage using mutation testing. The objective is to produce a set of good test cases that kill one or more undesired mutants, each mutant differing from the original program. Unlike simple algorithms, Genetic Algorithms are suitable for reducing data generation at a comparable cost. Optimized test cases are generated by the proposed approach for cost reduction and for revealing or killing undesired mutants.
    Keywords: Genetic Algorithms (GA); Software Testing (ST); Automatic Test Case Coverage (ATCC); Boundary Value Analysis (BVA); Mutation Testing (MT).
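    The mutation score that drives the fitness evaluation is simply the fraction of mutants killed by the test set. A minimal sketch with a hypothetical program and hand-written mutants (a GA would evolve the test inputs rather than fixing them as here):

```python
def mutation_score(mutants, test_suite):
    """Fraction of mutants killed: a mutant is killed when some test case's
    expected output differs from the mutant's actual output."""
    killed = sum(1 for m in mutants
                 if any(m(x) != expected for x, expected in test_suite))
    return killed / len(mutants)

# Original program under test: absolute value.
def absval(x):
    return x if x >= 0 else -x

# Hypothetical mutants (small operator replacements of absval).
mutants = [lambda x: x,                   # drops the negation branch
           lambda x: x if x > 0 else -x,  # '>=' mutated to '>' (equivalent here)
           lambda x: -x]                  # always negates

tests = [(3, 3), (-4, 4), (0, 0)]
print(mutation_score(mutants, tests))  # 2 of 3 killed; the '>' mutant is equivalent
```

A GA-based approach would use this score as (part of) the fitness function and evolve test inputs until the score stops improving.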

  • University-timetabling problem and its solution using GELS algorithm: A Case Study   Order a copy of this article
    by Majid Nikougoftar Nategh, Ali Asghar Rahmani Hosseinabadi, Valentina Emilia Balas 
    Abstract: Course scheduling involves a large volume of data with numerous constraints and unchangeable specifications, and every university deals with it several times a year. Course scheduling is an NP-hard problem, and solving it with traditional methods is very difficult, but evolutionary algorithms offer good solutions for this type of problem. In this paper, we use the Gravitational Emulation Local Search (GELS) algorithm, an evolutionary algorithm, to solve the course scheduling problem. Results demonstrate the good quality of the timetable produced by the proposed algorithm, as well as reduced runtime compared with other algorithms.
    Keywords: Course scheduling; GELS Algorithm; Genetic Algorithm.

  • AMST-MAC: Adaptive Sleeping Multi-Frames Selective Data Transmission Control for Wireless Sensor Networks   Order a copy of this article
    by Bindiya Jain, Gursewak Brar, Jyoteesh Malhotra 
    Abstract: Energy efficiency is the major issue in the design of wireless sensor networks. Given its importance, designing an efficient MAC protocol is of paramount concern. The proposed MAC protocol, called AMST-MAC (Adaptive Sleeping Multi-frame Selective data Transmission MAC), is an energy-saving mechanism whose objective is to remove redundancy and reduce the number of packets sent for the same amount of information by using selective data transmission (SDT). It allows a node to sleep when it is idle, even during the data cycle, using the concept of a dynamic duty cycle (DDC). The aim of this simulation study was to evaluate the proposed protocol in terms of energy efficiency, end-to-end delay, and packet delivery ratio compared with the SMAC protocol, without degrading service quality. The results obtained clearly show that AMST-MAC is more energy efficient than SMAC and maintains the lowest sender and receiver duty cycles. AMST-MAC decreases delay by a factor of 10%, so the overall mean delay shows a reasonable decrease. AMST-MAC consumes less energy in every round, making it a better protocol than S-MAC both with and without SDT.
    Keywords: Sensor networks; Medium Access Control; Energy Efficient; AMST-MAC Protocol; Selective data Transmission; Dynamic duty cycle.

  • An Intelligent Clustering Approach for Improving Search Result of a Website   Order a copy of this article
    by Shashi Mehrotra, Shruti Kohli, Aditi Sharan 
    Abstract: These days the internet has become part of our lives, and web data usage has increased tremendously. We propose a model that improves search results using a clustering approach. Clustering is used to group data into relevant folders so that information can be accessed quickly. The K-Means clustering algorithm is very efficient in terms of speed and is suitable for large data sets. However, K-Means has some drawbacks: the number of clusters must be defined at the start, the initialization affects the output, and it often gets stuck in local optima. We propose a hybrid model that determines the number of clusters itself and gives a globally optimal result; the number obtained is then passed as a parameter to K-Means. Thus, our novel hybrid model integrates the features of K-Means and the genetic algorithm, retaining the best characteristics of both while overcoming their drawbacks.
    Keywords: Clustering; K-Means algorithm; Genetic algorithm; Hybrid algorithm.
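    One common way to let a GA pick the cluster count, shown here as an illustration and not necessarily the authors' exact scheme, is to evolve candidate values of k and score each with a penalized within-cluster error, so that too many clusters is punished:

```python
import numpy as np

def kmeans_sse(X, k, iters=30):
    """k-means with deterministic farthest-point seeding; returns final SSE."""
    C = [X[0]]
    for _ in range(k - 1):
        d = ((X[:, None] - np.array(C)) ** 2).sum(-1).min(axis=1)
        C.append(X[np.argmax(d)])              # next seed: farthest point
    C = np.array(C, float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - C) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                C[j] = X[labels == j].mean(axis=0)
    return float(((X - C[labels]) ** 2).sum())

def ga_choose_k(X, kmax=8, pop=10, gens=15, lam=1.0, seed=0):
    """Toy GA over k: fitness = -(SSE + lam*k), truncation selection, +/-1 mutation."""
    rng = np.random.default_rng(seed)
    ks = rng.integers(2, kmax + 1, pop)
    for _ in range(gens):
        fit = np.array([-(kmeans_sse(X, int(k)) + lam * k) for k in ks])
        parents = ks[np.argsort(fit)][pop // 2:]      # keep the better half
        children = np.clip(parents + rng.integers(-1, 2, len(parents)), 2, kmax)
        ks = np.concatenate([parents, children])
    fit = np.array([-(kmeans_sse(X, int(k)) + lam * k) for k in ks])
    return int(ks[np.argmax(fit)])

# Three well-separated blobs: the penalized fitness peaks at k = 3.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.1, (20, 2)) for c in ([0, 0], [5, 5], [0, 5])])
print(ga_choose_k(X))
```

The penalty weight lam plays the role of the model-selection pressure; with it, the GA is the part of the hybrid that removes K-Means's need for a user-supplied k.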

  • A Clustered Neighborhood Consensus Algorithm for a Generic Agent Interaction Protocol   Order a copy of this article
    by Aarti Singh, Dimple Juneja, Rashmi Singh, Saurabh Mukherhjee 
    Abstract: The premise of this paper is twofold. It not only improves the existing generic agent interaction protocol (GIPMAS) but also uniquely addresses the issue of generating consensus among the agents participating in the protocol. In a multi-agent system, agents cooperate and coordinate to reach a decision while sending information. In a clustered multi-agent system, all member agents of a given cluster send their data to the cluster head, which forwards the processed information to the next level for further processing. It is quite apparent that agents in close proximity (belonging to the same or different clusters) would transmit redundant information. Hence, before sending raw data, member agents should mutually agree on a common decision (based on some common metrics) and send only the relevant, agreed-upon information to the next higher level. The paper contributes a consensus algorithm that is a marriage of a neighborhood algorithm and a discrete-time consensus protocol. The proposed neighborhood algorithm gives more weight to communication links joining two clusters than to links joining two agents within a cluster, which increases the rate of convergence of information. Thus, in a clustered network of agents, the cluster head and the executive cluster head are responsible for deriving consensus over the received information. Simulation reflects that the proposed mechanism improves the convergence time of information; however, a slight increase in task execution time is also observed due to the trade-off between output quality and mechanism complexity.
    Keywords: Multiagent Systems; Agent Interaction Protocol; Clustered Network; Neighborhood Algorithm.
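    The discrete-time consensus protocol referenced above iterates x_i(t+1) = x_i(t) + eps * sum_j a_ij (x_j(t) - x_i(t)). A small sketch with four agents in two clusters, where the inter-cluster link carries a larger weight in the spirit of the proposed neighborhood weighting (the topology and values are illustrative):

```python
import numpy as np

def consensus(x0, A, eps, steps=200):
    """Discrete-time consensus: x <- x - eps * L x, with L the graph Laplacian."""
    x = np.asarray(x0, float)
    L = np.diag(A.sum(1)) - A
    for _ in range(steps):
        x = x - eps * (L @ x)
    return x

# Agents 0-1 form one cluster, agents 2-3 another; the inter-cluster
# edge (1-2) gets a larger weight to speed up cross-cluster agreement.
A = np.array([[0, 1, 0, 0],
              [1, 0, 3, 0],
              [0, 3, 0, 1],
              [0, 0, 1, 0]], float)
x = consensus([1.0, 2.0, 8.0, 9.0], A, eps=0.1)
print(x.round(3))  # every agent converges to the average, 5.0
```

With a symmetric weight matrix this converges to average consensus; heavier inter-cluster edges raise the Laplacian's second eigenvalue and hence the convergence rate, which is the effect the abstract reports.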

  • Evolutionary Optimization Based Fractional Order controller for Web Transport Systems in Process Industries   Order a copy of this article
    by Haripriya N., Kavitha Paneerselvam, Seshadhri Srinivasan, Juri Belikov 
    Abstract: This investigation presents an optimization-based design of a fractional order proportional-integral (FO-PI) controller for web transport systems used in paper industries. The objective of the optimization algorithm is to reduce the integral absolute error of the closed-loop web transport system, considering the underlying physical and operating constraints. The resulting optimization problem is non-linear, and evolutionary algorithms, particle swarm optimization (PSO) and bacterial foraging optimization (BFO), are used to compute the controller parameters. The performance improvement achieved using the FOC is compared with a traditional proportional-integral-derivative controller. Our results show that the BFO-tuned FOC performs better.
    Keywords: Web Transport Systems (WTS); Web Transport Controllers (WTC); Fractional Order Controllers (FOC); Particle Swarm Optimization (PSO); Bacterial Foraging Optimization (BFO); Offline optimization.
    DOI: 10.1504/IJAIP.2018.10006748
  • Protagonist and deuteragonist based video indexing and retrieval system for movie and video song sequences   Order a copy of this article
    by Tushar Ratanpara, Narendra Patel 
    Abstract: The protagonist and deuteragonist are the two main characters that play leading roles in an Indian Hindi movie (IHM). Currently such information is attached using textual captions, which is highly unreliable. The research presented in this paper automatically indexes and retrieves content based on the protagonist and deuteragonist from large IHMs and video song sequences (VSS). In module 1, video song sequence indexes are extracted from the IHM using an audio-based approach; these indexes are used as input to module 2. Faces are identified in every VSS, and colour histogram and spatiogram descriptors are extracted from them. The similarity between two faces is computed using the Bhattacharyya coefficient, and similarity-based clustering is performed to obtain clusters of faces. Recognition of the protagonist and deuteragonist is done using SURF feature points. The experimental results are obtained using Indian Hindi movies of different genres.
    Keywords: Content based video indexing and retrieval; song sequences; clustering; similarity; color histogram; Spatiogram.
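    The Bhattacharyya coefficient used for face similarity is the sum of elementwise square roots of two normalized histograms; it equals 1 for identical distributions and falls toward 0 as they diverge. A minimal sketch on toy histograms:

```python
import numpy as np

def bhattacharyya(h1, h2):
    """Bhattacharyya coefficient of two histograms (1 = identical)."""
    h1 = np.asarray(h1, float)
    h2 = np.asarray(h2, float)
    h1, h2 = h1 / h1.sum(), h2 / h2.sum()   # normalize to distributions
    return float(np.sum(np.sqrt(h1 * h2)))

a = [10, 20, 30, 40]
print(bhattacharyya(a, a))                  # identical histograms -> 1
print(bhattacharyya(a, [40, 30, 20, 10]))   # differing histograms -> < 1
```

In the paper's pipeline the histograms come from face regions (with spatiograms adding spatial layout), and faces whose coefficient exceeds a threshold are grouped into the same cluster.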

  • An Improved Key Management Scheme in Cloud Storage   Order a copy of this article
    by VijayaKumar V, Abdul Quadir, Kiran Mary Matthew 
    Abstract: Nowadays, cloud services are used by numerous people around the globe. One of their major applications is cloud storage: users can store data in the cloud without needing their own hardware resources, paying only for the resources they use. In storage applications, the user typically provides the cloud with the data to be stored; the cloud encrypts this data and returns a key to the user, who then needs to store only this key for decryption. Storage of this key is a matter of concern: if the key is lost, the probability of data loss is very high. To avoid this, a large number of key management techniques have been proposed. In this paper, a key management scheme is proposed that regenerates the key, in case of its loss, using the attributes of the user.
    Keywords: cloud computing; data privacy; key management;.

  • An Effective System for Video Transmission and Error Recovery Mechanisms in Multimedia Networks   Order a copy of this article
    by Rahamathunnisa Usuff, R. Saravanan 
    Abstract: In this paper an effective system is proposed for video transmission. The system solves the problems arising from errors in the transmitted video and delivers video with the required quality of service; the reconstructed video maintains this quality at the decoder side. A video-dynamics-based error concealment algorithm is applied to recover errors that occur during transmission. The performance of the proposed system is measured by means of simulations using the JM reference software.
    Keywords: Error concealment; Video dynamics; Video transmission; Quality of service; Reconstructed Video.

    by Sharmila Banu Kather, B.K. Tripathy 
    Abstract: Data of a varied nature and in huge quantities are generated every day, ranging from tabulated, structured, and semi-structured data with numerical or categorical attributes. Data preprocessing presents data in a format favourable for applying analytics algorithms and deriving knowledge from them. Data analytics has revolutionized modern life by unwinding the knowledge and patterns mined from data. Clustering is an unsupervised learning technique with popular algorithms based on distance, density, dimensions, and other functions. These algorithms operate on numerical attributes, and special algorithms for data involving categorical features have also been reported. In this paper we propose a straightforward way of clustering data involving both numerical and categorical features based on Neighborhood Rough Sets. It does not require the calculation of extra parameters such as entropy, saliency, or dependency, nor the discretization of data. Hence its complexity is lower than that of algorithms proposed for categorical or mixed data, and it offers better efficiency.
    Keywords: clustering; mixed; categorical and numerical data; continuous data; rough sets; neighborhood rough sets; granulation.

  • A Unified Approach for Skin Colour Segmentation Using Generic Bivariate Pearson Mixture Model   Order a copy of this article
    by B.N. Jagadesh, K. Srinivasa Rao, Ch Satyanarayana 
    Abstract: Skin colour segmentation is a rapidly growing area of research in computer science for the identification and authentication of persons. In this paper, a novel generic bivariate Pearsonian mixture model for skin colour segmentation is proposed. It is observed that the hue and saturation of a colour image better characterize the features of individual human races. In general, the human race can be characterized into three categories, namely Asian, African, and European. The African skin colour feature can be modeled by the bivariate Pearson type-IIb distribution, the Asian skin colour feature by the bivariate Pearson type-IIaα distribution, and the European skin colour feature by the bivariate Pearson type-IVa distribution. The combination of all three races in an image can be characterized by a three-component mixture model. The model parameters are estimated by deriving the update equations of the EM algorithm for the generic bivariate Pearson mixture model; the parameters are initialized through the moment method of estimation and the K-Means algorithm. The segmentation algorithm is developed using component maximum likelihood under a Bayesian framework. The performance of the proposed algorithm is evaluated by experimentation with a random sample of five images, collected from our own database and various magazine websites, containing a combination of the three races (Asian, African, and European), and by computing segmentation performance metrics such as PRI, GCE, and VOI. The efficiency of the proposed model relative to the bivariate GMM is assessed through confusion matrices and ROC curves. It is observed that the proposed algorithm outperforms the existing algorithms.
    Keywords: Skin colour segmentation; Generic bivariate Pearsonian mixture model; EM-Algorithm; Segmentation performance metrics; Feature Vector.

  • An Intelligent and Interactive AR based Location Identifier for Indoor Navigation   Order a copy of this article
    by Shriram K Vasudevan, Karthik Venkatachalam, Harii Shree, Keerthana Rani, Priya Dharshini 
    Abstract: Augmented Reality (AR) has been in existence for more than five decades, but the techniques and methods for implementing this technology have developed only in the recent past, i.e., over the last decade. We have built an application using AR techniques with Android as the base platform, combining the Global Positioning System (GPS) and AR for indoor navigation. Even though applications such as Google Maps already exist for navigation, our application offers users more ease and attractiveness through AR. The data of the surroundings of a particular location is stored as latitude, longitude, and altitude (geo location) in the cloud. When a user visits a location for the first time, the geo location details are entered and stored in the cloud; subsequently, when the same user or a new user visits, the stored information about the location is displayed. The location details are updated as and when a new location is identified, and they are displayed as markers through the camera integrated into the application. For example, when a new student visits a school or college for a cultural fest, even after finding the correct building it is a tedious task to locate the correct venue or classroom, as the area can be vast. With our app, one can reach the correct venue, and the augmented reality feature makes the experience more interactive and user friendly.
    Keywords: Augmented Reality (AR); Android Application Development; Global Positioning System (GPS); Geo Location; Location; Location Manager; Indoor Navigation; Cloud Computing;.

  • River flow prediction with memory based artificial neural networks: A case study of Dholai river basin   Order a copy of this article
    by Shyama Debbarma, Parthatsarathi Choudhury 
    Abstract: Prediction of hydrologic time series has been one of the most challenging tasks in water resources management due to the non-availability of adequate data. Recently, applications of Artificial Neural Networks (ANNs) have proved quite successful in such situations in various fields. This paper demonstrates the use of memory-based ANNs to predict daily river flows. Two different networks, namely the gamma memory neural network (GMN) and the genetic algorithm-gamma memory neural network (GA-GMN), were chosen. The best network topologies for both ANN models are achieved with the Tanh transfer function and the Levenberg-Marquardt learning rule after calibration with multiple combinations of network parameters. The selected ANN models are then used to predict the daily mean flows of the Dholai (Rukmi) river in Assam, India, a sub-basin of the Barak river basin. A comparative study of both networks indicates that the GA-GMN model performed better than the GMN model, giving better results for both the training and testing datasets, with a minimum training MSE of 0.018 and a minimum testing MSE of 22.97. Hence the GA-GMN model is selected as an effective tool for predicting the flow features of the Dholai river.
    Keywords: Prediction; gamma memory; genetic algorithm; flow.

  • The Recommender System: A Survey   Order a copy of this article
    by Bushra Alhijawi, Yousef Kilani 
    Abstract: A recommender system is a helpful tool that reduces the time a user needs to find personalised products, documents, friends, places and services. In addition, recommender systems address a long-standing web problem: information overload. At the same time, many environments and technologies (i.e. cloud, mobile, social networks) have become popular and face the problem of large amounts of information, and researchers recognise that recommender systems are a suitable solution to this problem in those environments. This paper reviews recent research that applied recommender systems in mobile, social network or cloud environments. We classify these recommender systems into four groups (i.e. mobile, social, cloud and traditional (PC) recommender systems) depending on the technology or environment in which the RS is applied. The survey presents comparisons, advantages and challenges of these types of recommender systems, and will directly support researchers and professionals in their understanding of them.
    Keywords: Recommender system; Collaborative filtering; Recommendation; Hybrid; Mobile; Cloud; Social; cold-start; Content-based filtering; Demographic-based filtering.

  • Occlusion Detection and Processing using Optical Flow and Particle Filter   Order a copy of this article
    by Wesam Askar, Osama Elmowafy, Anca Ralescu, Aliaa Youssif, Gamal Elnashar 
    Abstract: Object tracking systems continue to be an intensive area of research, for which detection and processing of occlusion is a well-known challenge. This paper proposes a new approach to detection and handling of occlusion based on the integration of two known techniques, optical flow and particle filtering. Results of preliminary experiments show that the proposed method can detect and overcome the occlusion problem successfully during the tracking process.
    Keywords: Video tracking; optical flow; particle filter; occlusion.

  • Stochastic Modeling and Pilot Data Analysis towards Provisioning of Ambulance for Handling Emergency   Order a copy of this article
    by Bidyutbiman Sarkar, Pulak Kundu, Nabendu Chaki 
    Abstract: Emergency Medical Services (EMS) refer to various health care related services. Like several other EMS activities, ambulance site selection for the fastest service, to improve the chance of saving a human life, is a very important activity in developing countries. Besides other factors, uncertainty in the demand for an ambulance at a particular location depends on the type of casualty, its service time, and its availability from the nearest service point. The relocation model of ambulances in an emergency is one of the oldest optimization problems, and in a distributed setup the complexity of these algorithms increases exponentially with the number of constraints. In this work we try to find an alternative framework to reduce EMS time using the latest technologies, along with other additional EMS services at a reasonable cost, using a generalized stochastic Petri net (GSPN).
    Keywords: Ambulatory Care; Distributed System; ICT; PN; GSPN; EMS.

  • Improving Recommendation quality and performance of Genetic-Based Recommender System   Order a copy of this article
    by Bushra Alhijawi, Yousef Kilani 
    Abstract: Recommender systems help the user find the required item in a short time by filtering the available choices. This paper addresses the problem of recommending items to users by presenting three new genetic-based recommender systems (GARS+, GARS++ and HGARS). HGARS is a combination of GARS+ and GARS++, and is an enhanced version of GARS that works without the need for a hybrid model. In the proposed algorithm, the genetic algorithm is used to find the optimal similarity function, which depends on a linear combination of values and weights. We experimentally show that HGARS improves accuracy by 16.1%, recommendation quality by 17.2% and performance by 40%.
    Keywords: Collaborative filtering; Recommender System; Genetic Algorithms; Similarity.
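    The idea of a genetically optimised linear similarity function can be sketched as follows. The toy ratings, the fitness definition and the GA settings below are illustrative assumptions for the sketch, not the paper's actual GARS formulation:

```python
import random

# Hypothetical ratings on a 1-5 scale: user -> {item: rating}.
ratings = {
    "u1": {"a": 5, "b": 3, "c": 4},
    "u2": {"a": 4, "b": 2, "c": 5},
    "u3": {"a": 1, "b": 5, "c": 2},
}

def similarity(u, v, weights):
    # Linear combination of per-item closeness values and learned weights,
    # in the spirit of the abstract's "values and weights" formulation.
    common = sorted(set(ratings[u]) & set(ratings[v]))
    vals = [1 - abs(ratings[u][i] - ratings[v][i]) / 4 for i in common]
    return sum(w * x for w, x in zip(weights, vals))

def fitness(weights):
    # Illustrative fitness: a like-minded pair should score higher than an
    # unlike pair under the learned similarity function.
    return similarity("u1", "u2", weights) - similarity("u1", "u3", weights)

random.seed(0)
pop = [[random.random() for _ in range(3)] for _ in range(20)]
for _ in range(50):                          # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                       # elitist truncation selection
    children = []
    for p in parents:
        child = [w + random.gauss(0, 0.1) for w in p]   # Gaussian mutation
        children.append([min(max(w, 0.0), 1.0) for w in child])
    pop = parents + children

best = max(pop, key=fitness)
```

With elitism the best fitness never decreases, so the evolved weights should separate the two pairs at least as well as uniform weights.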

    by M. Senthilkumar, P. Ilango 
    Abstract: Task scheduling is an important area of research in big data and is considered at two levels: user level and system level. User-level scheduling concerns issues between the service provider and the customer, while system-level scheduling concerns resource management in the data centre. A drawback of various existing methods is that the increasing power consumption of data centres has become a significant issue. MapReduce clusters now constitute a major piece of the data centre for big data applications; their sheer size, highly fault-tolerant nature and low utilization levels make them less energy efficient. The complexity of scheduling increases with the size of the task, making it very tedious to perform scheduling effectively, and existing scheduling algorithms incur higher computational cost and lower efficiency. Multi-objective scheduling in cloud computing makes it difficult to resolve the problem in the case of complex tasks. These are the primary drawbacks of several existing works, which prompt us to pursue this research on task scheduling in cloud computing.
    Keywords: Firefly algorithm (FA); genetic algorithm (GA); task scheduling; Hadoop; Map Reduce framework.

  • An Interactive and Innovative Application For Hand Rehabilitation Through Virtual Reality   Order a copy of this article
    by Shriram K. Vasudevan, S. Aakash Preethi, Karthik Venkatachalam, Mithula G, Navethra G, Krithika Nagarajan 
    Abstract: Physiotherapy has been very monotonous for patients, and they tend to lose interest and motivation in exercising. Introducing games with short-term goals into rehabilitation is the best alternative to maintain patients' motivation. Our research focuses on the gamification of hand rehabilitation exercises to engage patients wholly in rehab and to maintain their compliance with repeated exercising, for a speedy recovery from hand injuries (wrist, elbow and fingers). This is achieved by integrating the Leap Motion sensor with the Unity game development engine: exercises (as gestures) are recognized and validated by the Leap Motion sensor, and the game applications for the exercises are developed using Unity. Gamification has been implemented by very few groups globally, and it has been taken as a challenge in our research. We successfully designed and built an engine that is interactive and real-time, providing a platform for rehabilitation. We have tested it with patients and received positive feedback, and the user can see the score through a GUI.
    Keywords: Rehabilitation; Physiotherapy; Gesture; Leap Motion Sensor; Recovery; Virtual Reality.

  • Discovering Communities for Web Usage Mining Systems   Order a copy of this article
    by Yacine SLIMANI, Abdelouaheb MOUSSAOUI, Yves LECHEVALLIER, Ahlem DRIF 
    Abstract: Discovering community structure in the field of web usage mining has been addressed in many different ways. In this paper, we present a new method for detecting community structure using Markov chains based on the set of frequent motifs. The basic idea is to analyse the occurrence probability of different frequent sequences during different user sessions in order to extract the communities that describe users' behaviour. The proposed method is constructed and successfully applied to the web site of the university campus of Farhat Abbas Setif.
    Keywords: Web usage mining; Community detection; Complex networks; Markov chains; Quality function.

  • Person Re-Identification Using kNN Classifier Based Fusion Approach   Order a copy of this article
    by Poongothai Elango, Andavar Suruliandi 
    Abstract: Re-identification is the process of identifying the same person from images or videos taken from different cameras. Although many methods have been proposed for re-identification, it remains challenging because of unsolved issues such as variation in occlusion, viewpoint, pose and illumination. The objective of this paper is to propose a fusion-based re-identification method to improve identification accuracy. To meet this objective, texture and colour features are considered. In addition, the proposed method employs a Mahalanobis metric based kNN classifier for classification. The performance of the proposed method is compared with existing feature-based re-identification methods, using the CAVIAR, VIPeR, 3DPes and PRID datasets for the experimental analysis. Results show that the proposed method outperforms the existing methods, and that the Mahalanobis metric based kNN classifier improves recognition accuracy in the re-identification process.
    Keywords: Person re-identification; Colour features; Texture feature; Feature Fusion.
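    A Mahalanobis metric based kNN match, as named in the abstract, can be sketched as follows. The 2-D descriptors, gallery labels and metric matrix are hypothetical stand-ins for the paper's colour/texture features:

```python
import math

def mahalanobis(x, y, inv_cov):
    # d(x, y) = sqrt((x - y)^T S^-1 (x - y)) for a 2-D feature vector, with
    # the inverse covariance given row-major as [a, b, c, d].
    dx, dy = x[0] - y[0], x[1] - y[1]
    a, b, c, d = inv_cov
    return math.sqrt(dx * (a * dx + b * dy) + dy * (c * dx + d * dy))

def knn_predict(query, gallery, inv_cov, k=3):
    # Rank gallery descriptors by Mahalanobis distance and vote over labels.
    ranked = sorted(gallery, key=lambda g: mahalanobis(query, g[0], inv_cov))
    votes = {}
    for feat, label in ranked[:k]:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Hypothetical fused descriptors (2-D for illustration) with person IDs.
gallery = [((1.0, 1.1), "p1"), ((0.9, 1.0), "p1"),
           ((3.0, 3.2), "p2"), ((3.1, 2.9), "p2")]
identity = [1.0, 0.0, 0.0, 1.0]   # identity metric reduces to Euclidean
```

A learned inverse covariance would replace `identity` in practice; the kNN vote itself is unchanged.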

  • Graph Embedded Discriminant Analysis for the Extraction of Features in Hyperspectral Images   Order a copy of this article
    by Hannah Adebanjo 
    Abstract: In remote sensed hyperspectral imagery (HSI), class discrimination has been a major concern in the process of reducing the dimensionality of hyperspectral images. Local Discriminant Analysis (LDA) is a widely accepted dimensionality reduction (DR) technique in HSI processing: it discriminates between classes of interest in order to extract features from the image. However, the drawbacks of its application to HSI are the presence of few labelled samples and its inability to extract an equivalent number of features for the classes in the image. This paper proposes a new graphical manifold DR algorithm for HSI. The proposed method has two objectives: to maximize class separability using unlabelled samples and to preserve the manifold structure of the image. The unlabelled samples are clustered and the labels from the clusters are used in our semi-supervised feature extraction approach. Classification is then performed using Support Vector Machines and Neural Networks. The analysis of the results shows that the proposed algorithm can preserve both the spatial and spectral properties of HSI while reducing the dimension, and that it performs better than some related state-of-the-art dimensionality reduction methods.
    Keywords: feature extraction; graph-based methods; manifold learning; hyperspectral image (HSI).

  • Adaptive Tutoring System based on Fuzzy Logic   Order a copy of this article
    by Makram Souii, Abed Mourad, Ghannem Adnane, Daouas Karim 
    Abstract: In recent years, education methods have changed and become very innovative and modern. Online adaptive learning, in particular, seems to be a revolutionary, competitive method, and the advancement of computer and networking technologies is the key to this whole change from classic education to modern online adaptive education. The majority of e-learning systems are based on Boolean logic: the system considers that the learner either likes a course characteristic or not, whereas the user may prefer that parameter gradually (low, medium, high). To this end, the proposed approach exploits semantic relations between data elements and learners' preferences to determine adapted UI components appropriate to learners' characteristics, based on fuzzy logic. The results of the evaluation confirm the efficiency of our technique, with an average of more than 77% precision and recall.
    Keywords: Adaptation; Adaptive course; Evaluation; Multi-criteria Decision Making; Intelligent Tutoring System.

  • Technical Analysis based Fuzzy support system for stock market Trading   Order a copy of this article
    by Aviral Sharma, Vishal Bhatnagar, Abhay Bansal 
    Abstract: Technical analysis forms an integral part of the life of a stock trader. In econometric analysis, technical analysis is a method for predicting the course of prices of the security under consideration through the study of past statistics relating to the equity, mostly price and volume. Traders tend to use this type of analysis to take decisions regarding a particular security. Fuzzy logic based systems can be used to develop decision models in which a trader's experience is incorporated into the model. In this paper, we present a hybrid approach between fuzzy logic and technical analysis. The system generates a signal on the direction of movement of the stock, helping the trader to better understand the underlying behaviour of the stock under consideration and take a decision accordingly.
    Keywords: Technical analysis; Commodity Channel index; relative strength index; William %R; ultimate oscillator; Aroon; Fuzzy Logic; Artificial intelligence.
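    The keywords name the relative strength index (RSI) among the inputs; the sketch below computes a simple-average RSI and feeds it into hypothetical triangular fuzzy memberships. The membership shapes and signal labels are illustrative assumptions, not the paper's rule base:

```python
def rsi(prices, period=14):
    # Classic Relative Strength Index: 100 - 100 / (1 + avg_gain / avg_loss),
    # using simple averages of gains and losses over the last `period` moves.
    gains, losses = [], []
    for prev, cur in zip(prices, prices[1:]):
        change = cur - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    avg_gain = sum(gains[-period:]) / period
    avg_loss = sum(losses[-period:]) / period
    if avg_loss == 0:
        return 100.0                      # all moves were gains
    return 100.0 - 100.0 / (1.0 + avg_gain / avg_loss)

def fuzzy_signal(value):
    # Hypothetical triangular memberships over the RSI scale, mimicking how
    # a fuzzy system might grade oversold / neutral / overbought readings.
    oversold = max(0.0, (30.0 - value) / 30.0)
    overbought = max(0.0, (value - 70.0) / 30.0)
    neutral = max(0.0, 1.0 - abs(value - 50.0) / 20.0)
    return max([("buy", oversold), ("hold", neutral), ("sell", overbought)],
               key=lambda s: s[1])[0]
```

A production system would combine several such indicator memberships through a rule base rather than a single argmax.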

  • Adaptive Savitzky-Golay Filtering and Its Applications   Order a copy of this article
    by Jozsef Dombi, Adrienn Dineva 
    Abstract: Noise reduction is a central issue in the theory and practice of signal processing. The Savitzky-Golay (SG) smoothing and differentiation filter is widely acknowledged as a simple and efficient method for denoising, yet only a few books on signal processing cover it. As is well known, the performance of the classical SG filter depends on the appropriate setting of the window length and the polynomial degree, which should match the scale of the signal; in the case of signals with a high rate of change, the performance of the filter may be limited. This paper presents a new adaptive strategy to smooth irregular signals based on the Savitzky-Golay algorithm. The proposed technique ensures high-precision noise reduction by iterative multi-round smoothing and correction, where in each round the parameters change dynamically according to the results of the previous smoothing. Our study also provides additional support for data compression based on an optimal resolution of the signal with linear approximation. Simulation results validate the applicability of the novel method.
    Keywords: Savitzky-Golay filter; adaptive multi-round smoothing; iterative smoothing and correction; noise removal; data compression.
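    The classical SG filter that the paper builds on is a fixed convolution; for a quadratic fit over a 5-point window the standard tabulated weights are (-3, 12, 17, 12, -3)/35. The minimal sketch below applies them (endpoint handling and the paper's adaptive parameter selection are omitted):

```python
# Tabulated quadratic, 5-point Savitzky-Golay smoothing weights.
SG_WEIGHTS = [-3 / 35, 12 / 35, 17 / 35, 12 / 35, -3 / 35]

def savitzky_golay(signal):
    # Slide the 5-point window over the signal; the two samples at each
    # endpoint are left unsmoothed here for brevity.
    out = list(signal)
    for i in range(2, len(signal) - 2):
        out[i] = sum(w * signal[i + j - 2] for j, w in enumerate(SG_WEIGHTS))
    return out

# A degree-2 SG filter reproduces any quadratic exactly at interior points,
# which is the property that distinguishes it from a plain moving average.
quadratic = [float(x * x) for x in range(7)]
smoothed = savitzky_golay(quadratic)
```

The adaptive strategy of the paper would re-select the window length and degree between rounds; this fixed kernel is only the building block.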

  • A new hybrid Genetic Algorithm for job shop scheduling problem   Order a copy of this article
    by Marjan Kuchaki Rafsanjani, Milad Riyahi 
    Abstract: The job shop scheduling problem is NP-hard. This paper proposes a new hybrid genetic algorithm to solve the problem in an appropriate way. A new selection criterion is introduced to tackle the premature convergence problem, and, to make full use of the structure of the problem itself, a new crossover based on the machines is designed. Furthermore, a new local search is designed which improves the local search ability of the proposed GA. The new approach is run on benchmark problems, and computer simulation shows the effectiveness of the proposed approach.
    Keywords: Job shop scheduling problem (JSSP); Genetic algorithm; Selection operator; Crossover operator; Local search.
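    GAs for the JSSP typically encode a schedule as a permutation of operations and need a crossover that preserves permutation validity. The paper's machine-based crossover is not specified in the abstract; the sketch below shows only the textbook order crossover (OX) baseline it departs from:

```python
def order_crossover(p1, p2, i, j):
    # Standard order crossover (OX) for permutation chromosomes: copy the
    # slice p1[i..j] into the child, then fill the remaining positions with
    # the missing genes in the order they appear in p2.
    child = [None] * len(p1)
    child[i:j + 1] = p1[i:j + 1]
    kept = set(child[i:j + 1])
    fill = iter(g for g in p2 if g not in kept)
    for k in range(len(child)):
        if child[k] is None:
            child[k] = next(fill)
    return child

# Cut points 1..3 keep genes 1, 2, 3 from the first parent.
child = order_crossover([0, 1, 2, 3, 4], [4, 3, 2, 1, 0], 1, 3)
```

Because the fill step only reuses genes absent from the copied slice, every child is again a valid permutation, i.e. a decodable schedule.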

  • An Optimized Component Selection Algorithm for Self-Adaptive Software architecture using the Component Repository   Order a copy of this article
    by Mohana Roopa Y., Rama Mohan Reddy A 
    Abstract: Component based software engineering focuses on the development and reuse of components. Component reuse depends on the storage and retrieval process, which is carried out by a component repository. This paper presents a component repository model that helps developers achieve good productivity. Selecting a component from the repository according to functionality and requirements is a crucial part, so this paper proposes an algorithm for optimizing component selection under functionality constraints such as customer size, reliability and performance. The experimental results evaluate the performance of the algorithm and show that the proposed algorithm performs better in terms of component selection.
    Keywords: component; software system selection; adaptability; functionality.

  • Test Optimization: An Approach Based On Modified Algorithm For Software Network   Order a copy of this article
    by Manju Khari, Prabhat Kumar, Gulshan Shrivastava 
    Abstract: Testing is an indispensable part of the software development life cycle. It is performed to improve the performance, quality, efficiency and reliability of the software network. In this paper, three algorithms are implemented, namely the Genetic Algorithm (GA), the Cuckoo Search Algorithm (CSA) and the Artificial Bee Colony (ABC) algorithm, for the purpose of test suite optimization; with the help of the results obtained from these three algorithms, a novel hybrid algorithm is proposed to enhance the optimization results. To test a system, suitable test cases are developed, but these test cases need to be optimized, as executing all of them is time-consuming: testing a system with all possible test cases increases the time required for testing and also affects the cost of the product. It is therefore a good idea to reduce the number of test cases, which in turn reduces the testing time and the work of a software tester. The authors focus on optimizing test suites so that only the best test cases need to be executed to test the software network. Nature-inspired algorithms are used to optimize the test cases, as they provide the best optimization techniques. The proposed algorithm is implemented and experiments are conducted on various real-time programs to evaluate the efficiency of the proposed approach. Experimental results show that the hybrid algorithm generates better or comparable results compared to the existing state-of-the-art algorithms.
    Keywords: Genetic; Cuckoo Search; Artificial bee; Test suite Optimization; Hybrid algorithm; software network; Test data.

  • Application of Artificial Neural Network (ANN) on deformation and densification behaviour of sintered Fe-C steel under cold upsetting   Order a copy of this article
    by Kandavel Thanjavur Krishnamoorthy, Ashok Kumar T, Vijay D, Aswanth Samraj 
    Abstract: Cold upsetting is one of the densification processes used for P/M materials to achieve the desired density by applying the required load. The present work aims to study the deformation and densification characteristics of plain carbon steel (Fe-C) containing various levels of carbon, viz. 0.2%, 0.5% and 1%, under cold upsetting. Elemental powders of iron (Fe) and graphite (C) were accurately weighed based on the composition requirements and blended homogeneously using a pot mill. Cylindrical preforms of Fe and Fe-C powders were prepared using a 100 T capacity Universal Testing Machine (UTM) by applying suitable axial pressure to attain 80% of the theoretical density of the respective alloy steels. The green compacts were sintered in a 3.5 kW electric muffle furnace, with nitrogen gas purged to prevent oxidation during sintering. The sintered preforms of the various Fe-C compositions were then subjected to cold upsetting. The axial and lateral deformations were calculated from physical measurements taken from the deformed and non-deformed specimens, and the density of the deformed preforms was measured by the Archimedes principle. The experimental data were further used to generate deformation and densification models using Artificial Neural Networks (ANNs). It is observed from the experimental results that increasing carbon content improves the deformation and densification properties of the iron material, as the carbon behaves like a lubricant and increases the binding strength between the grains. As the target value of the ANN model approaches unity, it can be concluded that the ANN predictions and experimental values are in good agreement. ANN can thus be used as a prediction model for the deformation and densification behaviour of any P/M material.
    Keywords: Artificial Neural Network; Powder metallurgy; Densification; Deformation; True axial stress; Plain carbon steel.

  • A study of Total Technical Life (TTL) for an Aircraft with implementation and suggestions for improvisation.   Order a copy of this article
    by Balachandran A, P.R. Suresh, Shriram K. Vasudevan 
    Abstract: Travel has become more sophisticated and inevitable these days, and aircraft have become one of the preferred means of reaching a destination, used not only by civilians but also by the military for operational purposes. With such complicated designs, there is a need for more reliable systems and for effective use of the service life of the aircraft, known as the Total Technical Life (TTL). The present system of fixing the TTL for an aircraft is a passive method in which predicted values are compared with values obtained from a sample aircraft specially monitored for this purpose. However, the actual fatigue of each aircraft is different, as every aircraft is flown in a different way under different conditions at different locations. To cater for these unknown parameters, a factor of safety is applied and a safe utilization life is obtained. When the aircraft reaches the safe life limit, it is withdrawn from service even though useful life still remains. In the absence of actual data for each aircraft, the present method is the only way to fly the aircraft safely, at the cost of under-utilization. In recent years, much advancement has taken place in data sensing, capturing and processing, and highly reliable computing platforms have become available at much lower cost. With this technological advancement, it is possible to monitor the fatigue of all the aircraft structures dynamically and collect the actual data. The actual fatigue experienced by the aircraft during its usage period can then be compared against the predicted value, so that the life of an aircraft can be extended without compromising safety. The proposed methodology was tested with a model aircraft and the readings were found to be consistent. The proposed system is a way forward for the optimal use of aircraft and a scientific way of providing life extensions based on actual data rather than an approximation of the service life of the aircraft fleet.
    Keywords: TTL; Aircraft; Total Technical Life; Under utilization; Life of the aircraft; safety; Arduino; Microcontroller.

  • A Stable Routing Algorithm for Mobile Ad Hoc Network Using Fuzzy Logic System   Order a copy of this article
    by Helen  
    Abstract: The Mobile Ad Hoc Network (MANET) is an infrastructure-less network, where the nodes communicate either directly or indirectly through intermediate nodes. The network topology can change frequently due to its dynamic nature and limited resource availability. In a MANET, energy-efficient routing is a major issue because nodes operate on limited battery power. An energy-efficient routing algorithm can ensure high performance by increasing the network lifetime, and, to make the network more scalable, the routing algorithm needs to maximize the usage of network resources. This paper proposes a novel routing approach, the Energy Aware Fuzzy Controlled Routing (EAFCR) algorithm. The proposed algorithm adds intelligence to the node by applying fuzzy decision tools to develop a more stable and energy-efficient route during the route discovery phase; the fuzzy logic system uses the per-hop delay, available energy and link quality to form a more stable route. With the proposed EAFCR algorithm, the packet delivery ratio, end-to-end delay, residual energy and throughput show improvements of 3.05%, 1.38%, 4.25% and 3.3% respectively over the existing Fuzzy Logic Modified AODV Routing (FMAR) protocol.
    Keywords: infrastructure-less; topologies; fuzzy decision; routing; protocol.
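    A fuzzy grading of a candidate route from delay, energy and link quality, as the abstract describes, can be sketched as follows. The triangular membership shapes and the min t-norm are illustrative assumptions; the actual EAFCR rule base is not given in the abstract:

```python
def tri(x, a, b, c):
    # Triangular membership function rising on [a, b] and falling on [b, c].
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def route_preference(delay, energy, link_quality):
    # All inputs normalised to [0, 1].  A "good" route means low per-hop
    # delay AND high residual energy AND high link quality, combined here
    # with the min t-norm (fuzzy AND).
    low_delay = tri(delay, -0.01, 0.0, 0.6)
    high_energy = tri(energy, 0.4, 1.0, 1.01)
    good_link = tri(link_quality, 0.4, 1.0, 1.01)
    return min(low_delay, high_energy, good_link)
```

During route discovery, each candidate route would be scored this way and the highest-preference route selected.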

  • Automatic Short Answer Grading using Rough Concept Clusters   Order a copy of this article
    by Udit Kr. Chakraborty, Debanjan Konar, Samir Roy, Sankhayan Choudhury 
    Abstract: The evaluation of text-based answers has remained a challenge for researchers in recent years, and with the growing acceptance of e-learning systems a solution needs to be achieved fast. While assessing knowledge content, correctness of expression and linguistic patterns are complex issues in themselves, a short answer may be evaluated using keyword matching only. The work proposed in this paper is aimed at evaluating short text answers, no longer than a single sentence, using keyword matching. The proposed method agglomerates keywords from a group of model answers, forming clusters of words. The evaluation process then exploits the inherent roughness of the keyword clusters to evaluate a learner's response through comparison and keyword matching. The novelty of the proposed system lies in the use of fuzzy membership functions along with rough set theory to evaluate the answers. Rigorous tests conducted on a dataset built for the purpose returned good correlation values with the average of two human evaluators. The proposed system also fares better than Latent Semantic Analysis (LSA) based and Link Grammar based evaluation systems.
    Keywords: Text answer; Single Sentence; Keyword; Concept Cluster; Rough Set; Latent Semantic Analysis; Link grammar.
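    The cluster-membership idea can be sketched as a crude stand-in for the paper's rough-set formulation: each concept cluster contributes a graded membership equal to the fraction of its keywords the answer hits. The example clusters and answer below are hypothetical:

```python
def fuzzy_keyword_score(answer, clusters):
    # For each concept cluster, membership = fraction of cluster keywords
    # found in the answer; the overall grade averages the memberships.
    # This is only an illustrative simplification of the rough-set scheme.
    words = set(answer.lower().split())
    memberships = []
    for cluster in clusters:
        hits = sum(1 for kw in cluster if kw in words)
        memberships.append(hits / len(cluster))
    return sum(memberships) / len(memberships)

# Hypothetical concept clusters agglomerated from model answers.
clusters = [{"stack", "lifo"}, {"push", "pop"}]
score = fuzzy_keyword_score("A stack is a LIFO structure with push and pop",
                            clusters)
```

A partially correct answer such as "a stack" would hit only half of the first cluster and none of the second, yielding a graded rather than binary score.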

  • A hybrid grey wolf optimization and pattern search algorithm for automatic generation control of multi area interconnected power systems   Order a copy of this article
    by Vikas Soni, Girish Parmar, Mithilesh Kumar 
    Abstract: A hybrid grey wolf optimization-pattern search (hGWO-PS) algorithm is proposed to optimize the parameters of two degree of freedom-proportional integral derivative (2DOF-PID) controllers in multi-area power systems for automatic generation control. The integral of time multiplied by absolute error (ITAE) is taken as the objective function. The algorithm is first applied to a two-area non-reheat thermal power system, followed by an analysis of the ITAE, the dynamic responses and the robustness of the system. The dynamic behaviour of the system optimized by the proposed approach hardly alters with broad changes in the load and system parameters within the range [-50%, +50%]. The proposed algorithm is also applied extensively to a three-area hydro-thermal power system with appropriate generation rate constraints (GRC). The simulation results show that the proposed algorithm performs better than recently published approaches in terms of lower ITAE value, settling time and overshoot, and a faster return of the frequency and tie-line power deviations to zero.
    Keywords: Automatic generation control; two area parallel interconnected thermal power system; three area interconnected hydro thermal power system; two degree of freedom-PID controllers; grey wolf optimization; pattern search; generation rate constraints; governor dead band nonlinearities.
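    The ITAE objective named in the abstract is the integral of time multiplied by the absolute error. A minimal numerical sketch, approximating the integral with the trapezoidal rule over sampled deviations (the sampling grid is an assumption for illustration):

```python
def itae(times, errors):
    # ITAE = integral of t * |e(t)| dt, approximated by the trapezoidal
    # rule over sampled frequency / tie-line power deviations e(t).
    total = 0.0
    for (t0, e0), (t1, e1) in zip(zip(times, errors),
                                  zip(times[1:], errors[1:])):
        f0, f1 = t0 * abs(e0), t1 * abs(e1)
        total += 0.5 * (f0 + f1) * (t1 - t0)
    return total
```

Because the integrand is weighted by time, late-persisting deviations are penalised more heavily, which is why ITAE tuning tends to shorten settling time.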

    by Rutuja Mote, Ambika Pawar 
    Abstract: The cloud is an umbrella under which internet-based development and services are scrutinized and explored. It can be seen as an enigma in which novel opportunities are pioneered to create a large-scale, flexible computing framework. The actors of a cyber supply chain engage through important functionalities such as the utility model of consumption with elasticity, the abstraction of the framework, and so on. Hybrid clouds vary greatly in sophistication, facilitating the portability of workloads across the entire inter-cloud without compromising users' availability, security or performance requirements. This paper helps to enhance a privacy design model as cloud computing adoption hits the fast lane. In the first phase, the system assimilates and devises the formation of a hybrid cloud architecture. In the second phase, the system implements security tactics, the Advanced Encryption Standard (AES) technique and a Byte Replacement Shuffling (BRS) algorithm, in accordance with the sensitivity level assigned to each file, to preserve privacy. The third phase delineates the optimization of response time (to upload and download a file) and of the workflow, using Map-Reduce for data deduplication, for a deep privacy and security solution.
    Keywords: Hybrid Cloud Architecture; File Upload; File Download; Byte Replacement Shuffling; Map-Reduce; Data Deduplication; Security; Privacy.

  • Possible Adoption of Various Machine Learning Techniques in Cognitive Radio-A Survey.   Order a copy of this article
    by Barnali Dey, Ashraf Hossain, Rabindranath Bera 
    Abstract: The Cognitive Radio (CR) system concept addresses the need of next generation wireless communication technology to provide intelligence and superior performance to a wireless device. A CR is essentially an intelligent system that is aware of its environment and is well able to adapt to the changing environment and user needs. This adaptation of the communication system can be realised well when machine learning capability is inculcated within the system, since a key strength of any machine learning paradigm is its ability to adapt to dynamically changing system parameters. In this paper, an attempt has been made to compile various applications of machine learning techniques to the different activities of the CR cycle. Further, this paper reviews work on the development of machine learning techniques for the spectrum sensing of CR, in order to make the CR system as a whole practically feasible and robust, thus mitigating the computational limitations of conventional techniques.
    Keywords: Cognitive Radio; Machine Learning; Spectrum Sensing; Energy Detection.

  • Raga Recognition through Tonic Identification using Flute Acoustics   Order a copy of this article
    by Sinith M S, Shikha Tripathi, Murthy K V V 
    Abstract: Tonic identification is traditionally approached using a pitch histogram; the acoustic characteristics of musical instruments have not been used for the purpose. Conventional tonic identifiers are either knowledge based or based on multi-pitch analysis, and these methods depend either directly or indirectly on the drone sound: their efficiency decreases drastically in the absence of the latter. In this paper, a tonic identification method that is independent of the drone sound is proposed for flute signals, making use of the acoustic characteristics of the instrument. In addition, tonic identification is utilized for real-time raga recognition.
    Keywords: Tonic identification; Indian Classical Music; Raga recognition; Flute acoustics.

  • Performance improvement in Cardiology department of a hospital by Simulation   Order a copy of this article
    by Shriram K. Vasudevan, Narassima Seshadri, Anbuudayasankar SP, Thennarasu M 
    Abstract: The healthcare industry plays a vital role in the life of humankind and in the economic development of a country. Healthcare services have to be provided as and when required, without time delay or compromise on quality. This research focusses on the reduction of patients' waiting time, as it is considered one of the important parameters governing service quality and improving patient satisfaction. This was achieved through a case study in the cardiology outpatient department of a private hospital in South India; cardiology was chosen as it is one of the most critical areas demanding immediate attention. The study follows a Discrete Event Simulation approach for analysing the trajectory of patients in the cardiology department, determining various performance parameters, suggesting changes to the existing system and developing alternative models to compare against the existing one. Reducing waiting time permits physicians to attend to more patients in a given period, as is evident from the results obtained from the developed models. Simulation results revealed that the four alternative systems proposed were more effective than the existing system.
    Keywords: Discrete Event Simulation; Arena model; Healthcare; Cardiology; Outpatient department; Waiting time reduction.

  • Computing the Shortest path with Words   Order a copy of this article
    by Arindam Dey, Anita Pal 
    Abstract: Computing with Words is a soft computing technique for solving decision-making problems with information described in natural language. It is inspired by the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements or computations. In this paper, we propose a generalized Dijkstra algorithm to solve the shortest path problem from a specific node to every other node on a fuzzy graph in which words taken from natural language are assigned to the arcs as their arc lengths. We call this problem computing the shortest path with words (CSPWW). In a shortest path problem, the arc lengths may represent time or cost. In real life, human beings describe those arc costs with terms such as small, large or some, which do not supply natural numbers or fuzzy numbers; we describe those terms as words. The same word may have different meanings to different people, so uncertainty appears in the description of a word in natural language. Here, we use Interval Type 2 Fuzzy Sets (IT2FSs) to capture the uncertainty of the words. A perceptual computer model is introduced for use in our algorithm. The Per-C associated with the shortest path problem is called a shortest path advisor (SPA), and its design is described in detail in this paper. It consists of three components: encoder, CWW engine and decoder. The encoder receives all the words present in the path and transforms them into IT2FSs. The CWW engine adds all the IT2FSs and returns an IT2FS for the corresponding path. The decoder receives the output of the CWW engine and calculates the corresponding centroid-based ranking value of the path. This rank is used to determine the shortest path. A numerical example of a transportation network is used to illustrate the effectiveness of the proposed method.
    Keywords: Computing with words; Interval type-2 fuzzy sets; perceptual computer; centroid rank.
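As a rough illustration of the CSPWW idea, the sketch below replaces the IT2FS word models with plain intervals (a deliberate simplification) and runs a Dijkstra-style search that ranks accumulated path costs by their centroid; the vocabulary and the graph in the usage example are hypothetical, not from the paper.

```python
import heapq

# Hypothetical vocabulary: each word maps to an interval [lo, hi] of arc cost.
# A real Per-C would encode words as interval type-2 fuzzy sets; plain intervals
# are used here only to keep the sketch short.
WORDS = {"tiny": (0, 2), "small": (1, 4), "some": (3, 7), "large": (6, 12)}

def centroid(iv):
    """Centroid-based rank of an interval (midpoint stands in for the IT2FS centroid)."""
    return (iv[0] + iv[1]) / 2.0

def add_intervals(a, b):
    """CWW-engine step: interval addition aggregates word costs along a path."""
    return (a[0] + b[0], a[1] + b[1])

def shortest_path_with_words(graph, source):
    """Dijkstra-style search ranking paths by the centroid of their aggregated interval."""
    best = {source: (0.0, (0.0, 0.0))}   # node -> (rank, accumulated interval)
    pq = [(0.0, source, (0.0, 0.0))]
    while pq:
        rank, node, acc = heapq.heappop(pq)
        if rank > best[node][0]:
            continue  # stale queue entry
        for nbr, word in graph.get(node, {}).items():
            new_acc = add_intervals(acc, WORDS[word])
            new_rank = centroid(new_acc)
            if nbr not in best or new_rank < best[nbr][0]:
                best[nbr] = (new_rank, new_acc)
                heapq.heappush(pq, (new_rank, nbr, new_acc))
    return {n: r for n, (r, _) in best.items()}
```

For example, on `{'a': {'b': 'small', 'c': 'large'}, 'b': {'c': 'tiny'}}` the two-hop path a-b-c (rank 3.5) beats the direct "large" arc (rank 9).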

  • An Analysis of the Most Accident Prone Regions within the Dhaka Metropolitan Region Using Clustering   Order a copy of this article
    by M. Rashedur Rahman 
    Abstract: Most of the world's developed countries have reduced unusual deaths, such as those from traffic accidents, by taking efficient steps. In Bangladesh, injuries from road accidents have become a regular occurrence, and the highly populated cities still see such incidents daily. As the number of vehicles increases and most drivers are unwilling to follow traffic rules, injuries due to traffic accidents are not going down at all. Among the big cities in Bangladesh, Dhaka has the highest number of road accidents, so in this paper we focus on the most hazardous regions in the Dhaka Metropolitan area. We collected accident-related data from the Accident Research Institute (ARI) at Bangladesh University of Engineering and Technology (BUET), located in the city of Dhaka. In our paper, we use Fuzzy C-means Clustering, Expectation Maximization, Hierarchical Agglomerative Clustering and K-means Clustering to identify the regions where traffic incidents occur the most in the Dhaka Metropolitan area. Missing values for some attributes in the dataset are overwritten by the mean/mode of the attribute itself.
    Keywords: data mining; accidental injury severity; clustering; hazardous areas; Dhaka metropolitan area.
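For intuition, here is a plain k-means pass of the kind used to group accident coordinates into hotspot clusters; the coordinates in the test data are invented, and seeding with the first k points stands in for the random restarts a real run would use (the paper's other three clustering methods are not reproduced here).

```python
import math

def kmeans(points, k, iters=100):
    """Plain k-means over 2-D points such as (latitude, longitude) pairs."""
    centroids = list(points[:k])  # first k points as seeds (a real run would randomize)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        new = [tuple(sum(v) / len(cl) for v in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:  # converged
            break
        centroids = new
    return centroids, clusters
```

Each returned centroid is a candidate hotspot centre; the cluster sizes give a crude ranking of how accident-prone each region is.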

  • A Statistical Comparison for Evaluating the Effectiveness of Linear and Nonlinear Manifold Detection Techniques for Software Defect Prediction   Order a copy of this article
    by Soumi Ghosh, Ajay Rana, Vineet Kansal 
    Abstract: Software systems commonly suffer from a wide range of defects. Nowadays, most software systems are released without any defect prediction; it is therefore essential to predict defects in time to improve software quality and security and to obtain the desired result at minimum cost. This is possible if defects can be predicted in the initial stage of the software development process by applying proper and effective techniques. This paper presents Manifold Detection Techniques (MDTs), which differ from conventional methods previously applied for software defect prediction, such as regression and feature selection methods. The performance of classifiers applied with and without MDTs is compared in order to evaluate the effectiveness of different (linear and nonlinear) MDTs in reducing the dimensions of software datasets. Eight classifiers were applied to four PROMISE datasets to determine the best performing classifier with respect to prediction performance measures (accuracy, precision, recall, F-measure, AUC and misclassification error), with bias reduced by a 10-fold cross-validation test. The experimental results show that FastMVU produces the most accurate results of all the nonlinear MDTs when applied to a defective software dataset. A comparative analysis and evaluation of the prediction performance of all classifiers demonstrated that the Bayesian Network (BN) is the most effective technique for software defect prediction, whether used with (linear or nonlinear) or without MDTs. The performance of all classifiers with and without MDTs has been statistically analyzed and tested using a paired two-tailed t-test.
    Keywords: Defects; Linear; Nonlinear; Manifold Detection; Promise Datasets; Prediction; Software System.
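The paired two-tailed t-test mentioned above compares two classifiers on matched folds; a minimal sketch of the statistic (with made-up accuracy scores) looks like this, after which the critical value would be read from a t table with n-1 degrees of freedom.

```python
import math

def paired_t_statistic(a, b):
    """t statistic for a paired two-tailed test on matched performance scores."""
    d = [x - y for x, y in zip(a, b)]          # per-fold differences
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n)
```

A |t| larger than the tabulated two-tailed critical value rejects the hypothesis that the two classifiers perform equally on these folds.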

  • Modified SVPWM Technique for a Sensorless Controlled Induction Motor Drive using Neural Network Observer and Predictive Controller   Order a copy of this article
    by Shoeb Hussain, Mohammad Abid Bazaz 
    Abstract: The use of a multi-level inverter in a sensorless control scheme increases reliability in state parameter estimation. In this paper, sensorless control is presented using a neural network observer that uses the direct and quadrature current and voltage components for speed estimation. Distortion in current and voltage results in deviations in speed estimation. To address this problem, this paper presents a modified space vector modulation scheme for sensorless control of an induction motor drive fed by a multi-level inverter. The modulation scheme uses fewer switching states and is employed on a cascaded H-bridge inverter configuration. This results in reliable speed estimation by reducing distortion in current and voltage measurement. Moreover, the paper uses a predictive controller for speed control. Simulation is carried out in MATLAB and the results show improved performance of sensorless operation.
    Keywords: Induction motor; predictive controller; neural network observer; Sensorless Vector control; SVPWM.

  • Determination of Reliability Index of Cantilever Retaining Wall by RVM, MPMR and MARS   Order a copy of this article
    by Pijush Samui, Rahul Kumar, Sunita Kumari, Sanjiban Sekhar Roy 
    Abstract: The overturning criterion is an important parameter in the design of cantilever retaining walls. This study adopts Relevance Vector Machine (RVM) based First Order Second Moment (FOSM), Minimax Probability Machine Regression (MPMR) based FOSM and Multivariate Adaptive Regression Spline (MARS) based FOSM methods for determining the reliability index of a cantilever retaining wall based on the overturning criterion. RVM, MPMR and MARS have been used to overcome the limitations of the FOSM model. An example illustrates how the proposed RVM based FOSM, MPMR based FOSM and MARS based FOSM analyses can be carried out, and a comparative study has been made between the developed models. The results demonstrate that the developed models are able to overcome the limitations of FOSM.
    Keywords: Retaining Wall; Reliability; First Order Second Moment Method; Minimax Probability Machine Regression; Relevance Vector Machine; Multivariate Adaptive Regression Spline.
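For reference, the plain FOSM computation that the RVM/MPMR/MARS surrogates feed into can be sketched as follows; the limit-state function and the moment values in the usage example are hypothetical stand-ins, not the paper's wall geometry.

```python
import math

def fosm_beta(g, means, stds, h=1e-6):
    """First Order Second Moment reliability index: beta = mu_g / sigma_g,
    with sigma_g from a first-order Taylor expansion of g about the means."""
    mu_g = g(means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, stds)):
        xp = list(means); xp[i] = m + h
        xm = list(means); xm[i] = m - h
        grad = (g(xp) - g(xm)) / (2 * h)  # central-difference partial derivative
        var += (grad * s) ** 2            # independent variables assumed
    return mu_g / math.sqrt(var)
```

For an overturning check, g would be resisting moment minus overturning moment; e.g. `fosm_beta(lambda x: x[0] - x[1], [300.0, 180.0], [30.0, 40.0])` gives beta = 120 / 50 = 2.4 (invented numbers).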

  • Thumb Movement for Prosthetic Hand based Fuzzy Logic   Order a copy of this article
    by Anilesh Dey, Amarjyoti Goswami, Abdur Rohman, Jamini Das, Nilanjan Dey, Amira S. Ashour, Fuqian Shi 
    Abstract: Electromyography innovation has led to the development of modern prosthesis (artificial limb) control. Prosthetic hands are developed to assist amputees in their daily activities. Over the years, the fluid movements required to carry out different functions, such as gripping and holding, have not reached their full potential, especially in the thumb movement pattern. Consequently, the current work proposes an efficient mechanism for the movement of the prosthetic thumb that can position the thumb even at intermediate angles such as 45.3 degrees and 78.6 degrees. Obtaining such flexibility leads to a movement pattern that is more similar to that of the human hand. A fuzzy-based control strategy is applied to design a prosthetic thumb with the above-mentioned movement pattern. A Mamdani fuzzy control model is proposed with three input variables, namely the thumb's first joint bend, second joint bend and the second joint's movement in the left and right directions. The proposed system provided the expected results, where twenty-seven combinations of rules facilitate the alignment of the prosthetic thumb at different angles.
    Keywords: Intermediate movements; Mamdani fuzzy control; Prosthetic thumb movement.
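To show the Mamdani pipeline (fuzzify, min implication, max aggregation, centroid defuzzification) in miniature, here is a one-input, two-rule toy; the membership functions and ranges are invented and far simpler than the paper's three-input, 27-rule controller.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani_thumb(bend):
    """Toy one-input Mamdani controller: bend angle (0-90 deg) -> thumb angle."""
    # Rule firing strengths (antecedents).
    low = tri(bend, -45, 0, 90)    # "bend is low"  -> thumb near 0
    high = tri(bend, 0, 90, 135)   # "bend is high" -> thumb near 90
    # Clip each rule's output set (min implication), aggregate (max),
    # and defuzzify by a sampled centroid over the 0-90 degree range.
    num = den = 0.0
    for i in range(91):
        y = float(i)
        mu = max(min(low, tri(y, -45, 0, 90)), min(high, tri(y, 0, 90, 135)))
        num += y * mu
        den += mu
    return num / den if den else 0.0
```

Because the output is a centroid rather than a rule label, the controller produces intermediate angles between the rule peaks, which is the property the abstract emphasises.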

  • QoS-Aware Online Mechanism for Dynamic VM Provisioning in Cloud Market Using Q-learning   Order a copy of this article
    by Ayoub Alsarhan 
    Abstract: A cloud provider (CP) leases various resources, such as CPUs, memory and storage, in the form of Virtual Machine (VM) instances to clients over the internet. This paper tackles the issue of quality of service (QoS) provisioning in a cloud environment. We examine the use of Q-learning for provisioning VMs in the cloud market. The extracted decision function should decide when to reject new requests for VMs that would violate the QoS guarantee. This problem requires the reward for the CP to be maximized while simultaneously meeting QoS constraints. These complex, contradicting objectives are embedded in our Q-learning model, which is developed and implemented as shown in this paper. Numerical analysis shows the ability of our solution to earn significantly higher revenue than the alternatives.
    Keywords: Quality of Service; Cloud Computing; Resource Management; Q-learning; Cloud Service Trading.
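A tabular sketch of the Q-learning loop described above, with a toy state space (host load level 0-4) and hypothetical rewards: admitting earns revenue unless the host is nearly full, where it incurs a QoS penalty. The paper's actual state, action and reward design is richer than this.

```python
import random

def train_admission_policy(steps=5000, seed=1):
    """Tabular Q-learning for VM admission: state = load level (0-4),
    action = 0 reject / 1 admit. Rewards are invented for illustration."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(5) for a in (0, 1)}
    alpha, gamma, eps = 0.1, 0.9, 0.1
    s = 0
    for _ in range(steps):
        # Epsilon-greedy action selection.
        a = rng.choice((0, 1)) if rng.random() < eps else max((0, 1), key=lambda x: q[(s, x)])
        if a == 1:
            r = 1.0 if s < 4 else -5.0   # QoS penalty when the host is full
            s2 = min(s + 1, 4)
        else:
            r = 0.0
            s2 = max(s - 1, 0)           # a running VM departs
        # Q-learning update rule.
        q[(s, a)] += alpha * (r + gamma * max(q[(s2, 0)], q[(s2, 1)]) - q[(s, a)])
        s = s2
    return q
```

After training, the greedy policy admits when lightly loaded and rejects near capacity, which is exactly the admit/reject decision function the abstract describes.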

  • Using modified background subtraction for detecting vehicles in Videos   Order a copy of this article
    by Mohamed Maher Ata, Mohamed El-Darieby, M.Abd Elnaby, Sameh A. Napoleon 
    Abstract: In this paper, a comparison study is introduced between a traditional foreground detector (background subtraction technique) and a modified one (empty frame subtraction technique). Our case study was estimating average vehicular speed and the level of crowdedness in 3 test traffic videos with 5 different indices: frame rate, resolution, number of frames, duration and extension. The proposed modification of the background subtraction strategy aims to reduce vehicle detection processing time, which increases vehicle tracking efficacy. In addition, we have applied several video degradations (salt and pepper noise, Gaussian noise and speckle noise) to the traffic videos in order to evaluate the effect of challenging weather conditions on detection processing time. This degradation has been applied to both the traditional and the modified background subtraction for detecting vehicles in traffic videos. Results show an obvious improvement in the processing time of the detected vehicles with the proposed modification compared with the traditional background detector.
    Keywords: computer vision; foreground object detection; background subtraction; video degradation.
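The empty-frame idea reduces, per frame, to a thresholded absolute difference against a single pre-recorded empty-road frame, with no background model to maintain; a grayscale toy version (pixel values in the test are invented) might look like this.

```python
def detect_vehicles(frame, empty_frame, threshold=30):
    """Empty-frame subtraction: pixels differing from a pre-recorded empty-road
    frame by more than the threshold are flagged as foreground (vehicle) pixels.
    Frames are grayscale images as nested lists of 0-255 intensities."""
    return [
        [1 if abs(p - q) > threshold else 0 for p, q in zip(row, erow)]
        for row, erow in zip(frame, empty_frame)
    ]
```

Unlike a running-average background model, the reference frame never updates, which is where the processing-time saving the abstract reports would come from; the cost is sensitivity to lighting drift.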

  • An Efficient prefix based labeling scheme for Dynamic update of XML Documents   Order a copy of this article
    by Dhanalekshmi Gopinathan, Krishna Asawa 
    Abstract: The increasing volume of XML documents and the real-world requirement to support updates have motivated the research community to develop dynamic labeling schemes. Each dynamic labeling scheme proposed to date differs in its characteristics and has its own advantages and limitations; schemes may differ in the queries supported, their update performance, label size, etc. In this paper, a new prefix based labeling scheme is proposed which is compact and dynamic, and which also facilitates the computation of structural relationships, the core part of query processing. The proposed scheme can handle both static and dynamic XML documents. Experiments were conducted to evaluate storage requirements, structural relationship computation and update processing, and the results are compared with some existing labeling mechanisms.
    Keywords: Labeling Scheme; XML; Structural relationship; dynamic update; ancestor-descendant; parent-child relationship.
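Prefix (Dewey-style) labels make the structural checks at the heart of query processing purely label-local: no document access is needed. The paper's own compact dynamic encoding differs, but the relationship tests any such scheme must support look like this.

```python
def is_ancestor(a, b):
    """Ancestor-descendant test from prefix (Dewey-style) labels alone:
    a is an ancestor of b iff a's component list is a proper prefix of b's."""
    pa, pb = a.split('.'), b.split('.')
    return len(pa) < len(pb) and pb[:len(pa)] == pa

def is_parent(a, b):
    """Parent-child is the ancestor test restricted to exactly one extra component."""
    pa, pb = a.split('.'), b.split('.')
    return len(pb) == len(pa) + 1 and pb[:len(pa)] == pa
```

Comparing component lists rather than raw strings avoids the classic false positive where "1.2" would string-match the start of "1.20.3".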

  • Content based load balancing of tasks using task clustering for cost optimization in cloud computing environment   Order a copy of this article
    by Kaushik Sekaran, Venkata Krishna P 
    Abstract: Cloud computing is the recent mantra for techies and internet users all around the world. The power of cloud computing is enormous, as it provides big services at an optimal cost and in a reliable manner. Load balancing of tasks on cloud servers is an important issue to be addressed. In this paper, we propose a task clustering algorithm to minimize the load across cloud servers through content based load balancing of tasks, using task clustering methods and a cost reduction method for optimal energy consumption at the cloud data center heads. The results analysed in our paper are better than those of existing content based load balancing models. Our approach clearly achieves optimal load balancing of tasks with respect to upload bandwidth utilization, minimal latency and other QoS (Quality of service) metrics.
    Keywords: Cloud computing; load balancing; tasks clustering; cost reduction; energy consumption; QoS (Quality of service) metrics.

  • A Two Step Clustering Method for Facility Location Problem   Order a copy of this article
    by Ashish Sharma, Ashish Sharma, A.S. Jalal, Krishna Kant 
    Abstract: Facility location problems are designed with the objective of gaining more profit. Profit is gained when the maximum demand is satisfied, and demand is satisfied when the maximum number of customers is covered or served. Various approaches have been investigated to reach the maximum number of customers. In general, most approaches to facility location models treat the service area of a facility as a radius, so facilities that serve customers within a radius can be handled by conventional approaches. However, conventional approaches fail to allocate facilities whose service areas are constrained by topographical and road network barriers. In this paper, we propose a model for optimized facility allocation in such scenarios, using a two step clustering approach to solve the facility location problem. Experimental results illustrate that the proposed algorithm, based on density affinity propagation (DAP), can be used to construct a solution for maximal service and covering area.
    Keywords: Facility location; Proximity; Density; Approximation; Clustering.

  • Marker and Modified Graph Cut Algorithm for Augmented Reality Gaming.   Order a copy of this article
    by Shriram K. Vasudevan, R.M.D. Sundaram 
    Abstract: Augmented reality aims at superimposing a computer generated image on a user's view of the real world, thereby creating a composite view. Virtual reality, on the other hand, keeps the user isolated from the real world and immersed in a world that is completely fabricated. The main objective of this research is to capture a real life image and augment it as a component of a gaming environment using the principles of augmented reality. For this implementation, we have chosen car racing as our gaming environment. The core elements are image segmentation using a CIELAB color space based graph cut algorithm, 2D to 3D modelling, and game development with augmented reality. The tools utilised are MATLAB, insight3d and Unity3D. The proposed idea will enable someone to view a virtual environment with real components that are integrated dynamically.
    Keywords: Augmented Reality; Gaming; Image extraction; Modelling; Image segmentation; Racing.

  • Predicting longitudinal dispersion coefficient in natural streams using Minimax Probability Machine Regression and Multivariate Adaptive Regression Spline   Order a copy of this article
    by Sanjiban Sekhar Roy, Pijush Samui 
    Abstract: This article employs Minimax Probability Machine Regression (MPMR) and Multivariate Adaptive Regression Spline (MARS) for prediction of the longitudinal dispersion coefficient in natural streams. Hydraulic features such as channel width (B), flow depth (H), flow velocity (U) and shear velocity (u*), and geometric features such as channel sinuosity (σ) and channel shape parameter (β), were taken as the inputs. The dispersion coefficient Kx was the decision parameter for the proposed machine learning models. MARS does not assume any functional relationship between inputs and output; it is a non-parametric regression model that splits the data and fits each interval with a basis function. MPMR is a probabilistic model which maximizes the minimum probability that the predicted output lies within some bound of the true regression function. The proposed study gives an equation for prediction of the longitudinal dispersion coefficient based on the developed MARS model, which has been compared with the proposed MPMR model. Finally, the performance of the models has been measured by different performance metrics.
    Keywords: Longitudinal Dispersion Coefficient; Natural Streams; Minimax Probability Machine Regression; Prediction; Multivariate Adaptive Regression Spline.

  • A Brain-like Cognitive Process with Shared Methods   Order a copy of this article
    by Kieran Greer 
    Abstract: This paper describes a new entropy-style of equation that may be useful in a general sense, but can be applied to a cognitive model with related processes. The model is based on the human brain, with automatic and distributed pattern activity. Methods for carrying out the different processes are suggested. The main purpose of this paper is to reaffirm earlier research on different knowledge-based and experience-based clustering techniques. The overall architecture has stayed essentially the same and so it is the localised processes or smaller details that have been updated. For example, a counting mechanism is used slightly differently, to measure a level of cohesion instead of a correct classification, over pattern instances. The introduction of features has further enhanced the architecture and the new entropy-style equation is proposed. While an earlier paper defined three levels of functional requirement, this paper re-defines the levels in a more human vernacular, with higher-level goals described in terms of action-result pairs.
    Keywords: Cognitive model; distributed architecture; entropy; neural network; concept tree.

  • Cross-corpus Classification of Affective Speech   Order a copy of this article
    by Imen Trabelsi, Mohammed Salim Bouhlel 
    Abstract: Automatic speech emotion recognition still has to overcome several obstacles before it can be employed in realistic situations. One of these barriers is the lack of suitable training data, both in quantity and quality. The aim of this study is to investigate the effect of cross-corpus data on automatic classification of emotional speech. In this work, feature vectors constituted by the Mel Frequency Cepstral Coefficients (MFCC) extracted from the speech signal are used to train Support Vector Machines (SVM) and Gaussian mixture models (GMM). We evaluate on three different emotional databases from three different languages (English, Polish, and German) following three cross-corpus strategies. In the intra-corpus scenario, the accuracies were found to vary widely between 70% and 87%. In the inter-corpus scenario, the obtained average recall is 70.87%. The accuracies in the cross-corpus scenario were found to be below 50%.
    Keywords: Cross corpus strategies; Speech emotion recognition; GMM; SVM; MFCC.

  • GA based efficient Resource allocation and task scheduling in multi-cloud environment   Order a copy of this article
    by Tamanna Jena, Jnyana Ranjan Mohanty 
    Abstract: Efficient resource allocation to balance load evenly in a heterogeneous multi-cloud computing environment is challenging. Resource allocation followed by competent scheduling of tasks is of crucial concern in cloud computing. Load balancing assigns incoming job-requests to resources evenly so that all involved resources are efficiently utilized. The number of cloud users is immense, the volume of incoming job-requests is arbitrary, and data is enormous in cloud applications. Because resources in cloud computing are limited, it is challenging to deploy various applications with irregular capacities and functionalities in a heterogeneous multi-cloud environment. In this paper, Genetic Algorithm based task mapping followed by priority scheduling in a multi-cloud environment is proposed. The proposed algorithm has two important phases, namely mapping and scheduling. Rigorous simulations were performed on synthetic data for a heterogeneous multi-cloud environment, and the experimental results are compared with existing First In First Out (FIFO) mapping and scheduling. The results clearly prove better performance of the entire system in terms of makespan time and throughput.
    Keywords: Load Balancing; Task Scheduling; Cloud Computing; multi-cloud environment; Genetic Algorithm.
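The mapping phase can be caricatured as a small GA over task-to-VM assignments; the task lengths, VM speeds and GA settings below are invented, and fitness is simply makespan (the paper's scheduler also layers priority scheduling on top, which is not shown).

```python
import random

def ga_map_tasks(task_lens, vm_speeds, pop=30, gens=60, seed=3):
    """Toy GA for task-to-VM mapping: a chromosome assigns each task a VM index;
    fitness is makespan (max per-VM finish time), lower is better."""
    rng = random.Random(seed)
    n, m = len(task_lens), len(vm_speeds)

    def makespan(chrom):
        load = [0.0] * m
        for t, v in zip(task_lens, chrom):
            load[v] += t / vm_speeds[v]
        return max(load)

    popn = [[rng.randrange(m) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=makespan)
        elite = popn[: pop // 2]          # elitist selection keeps the best half
        children = []
        while len(elite) + len(children) < pop:
            p1, p2 = rng.sample(elite, 2)
            cut = rng.randrange(1, n)
            child = p1[:cut] + p2[cut:]   # one-point crossover
            if rng.random() < 0.2:        # mutation: reassign one random task
                child[rng.randrange(n)] = rng.randrange(m)
            children.append(child)
        popn = elite + children
    best = min(popn, key=makespan)
    return best, makespan(best)
```

On four equal tasks over two equal-speed VMs, the GA should recover a balanced 2-2 split with makespan 8.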

  • Using Artificial Intelligence Techniques in Collaborative Filtering Recommender Systems: Survey   Order a copy of this article
    by Yousef Kilani, Bushra Alhijawi, Ayoub Alsarhan 
    Abstract: The Internet currently contains a huge amount of data which is growing exponentially. This leads to the problem of information overload, which makes the task of searching for information difficult and time consuming. A recommendation system is a filtering technique that recommends items to users in order to reduce the list of choices and hence save their time. There are two types of algorithms for building recommender systems: collaborative filtering methods and content-based filtering methods. It is common knowledge that collaborative filtering is one of the most commonly used recommendation approaches; therefore, our interest in this work is in collaborative filtering algorithms. Many types of algorithms are used to build recommender systems, including data mining techniques, information retrieval techniques and artificial intelligence algorithms. Although a number of studies have developed recommendation models using collaborative filtering, few have tried to combine CF with other artificial intelligence techniques, such as genetic algorithms, as a tool to improve recommendation results. This survey presents the state-of-the-art artificial intelligence techniques used to build collaborative filtering recommender systems, including fuzzy algorithms, genetic algorithms, ant colony algorithms, swarm optimization algorithms, neural network algorithms, and machine learning algorithms.
    Keywords: Recommendation system; web intelligence; artificial intelligence; survey.

  • Efficient and Secure Approaches for Routing in VANETs   Order a copy of this article
    by Marjan Kuchaki Rafsanjani, Hamideh Fatemidokht 
    Abstract: Vehicular ad hoc networks (VANETs) are a particular type of mobile ad hoc network (MANET). These networks provide communication services between nearby vehicles, and between vehicles and roadside infrastructure, that improve road safety and provide travelers' comfort. Due to the characteristics of VANETs, such as self-organization, low bandwidth, variable network density, rapid changes in network topology, the need to provide safe driving and enhance traffic efficiency, etc., and their applications, problems related to these networks, such as routing and security, are popular research topics. A lot of research has been performed to provide efficient and secure routing protocols. In this paper, we investigate and compare various routing protocols based on swarm intelligence and key distribution in VANETs.
    Keywords: Vehicular ad hoc networks (VANETs); Swarm intelligence; Routing protocols; Cryptography.

  • Fuzzy Project Scheduling with Critical Path Including Risk and Resource Constraints Using Linear Programming   Order a copy of this article
    by Shahram Saeidi, Samira Alizadeh Aminloee 
    Abstract: Project scheduling is one of the important issues of project management; it has raised the interest of researchers and several methods have been developed for solving this problem. While deterministic models are used in most studies, uncertainty is an intrinsic property of most real-world projects, which involve activities with uncertain processing times and resource usages. In this paper, a fuzzy linear programming model is proposed for project scheduling considering risk and resource constraints under an uncertain environment, in which activity duration and the amount of resources used by each activity are defined by fuzzy membership functions. The proposed model is simulated in MATLAB R2009a, and four test cases adopted from the literature are implemented. The computational results show that the proposed model decreases the critical path length by about 4% in comparison with similar methods.
    Keywords: Fuzzy Project Scheduling; Critical Path; Linear Programming.
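With crisp numbers, the critical path length is just the longest path through the activity DAG; the sketch below uses hypothetical activities, and the paper's contribution is to let the durations (and resource usages) be fuzzy membership functions rather than the crisp values shown here.

```python
def critical_path_length(tasks):
    """Longest path through a task DAG; tasks = {name: (duration, [predecessors])}.
    Crisp durations only; a fuzzy model would replace each duration with a
    membership function and fuzzy arithmetic."""
    finish = {}

    def ft(name):
        # Earliest finish = own duration + latest predecessor finish (memoised).
        if name not in finish:
            dur, preds = tasks[name]
            finish[name] = dur + max((ft(p) for p in preds), default=0.0)
        return finish[name]

    return max(ft(t) for t in tasks)
```

For `{'A': (3, []), 'B': (2, ['A']), 'C': (4, ['A']), 'D': (1, ['B', 'C'])}` the critical path is A-C-D with length 8.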

  • Analysis of Energy Efficiency Based on Shortest Route Discovery in Wireless Sensor Network   Order a copy of this article
    by Mohit Mittal 
    Abstract: Today's scenario is totally based on the advancement of existing technologies to obtain more reliable wireless communication. Wireless sensor networks are one of the popular emerging technologies and are commonly deployed in harsh environments. These networks depend mainly on battery power, so our mission is to reduce energy consumption as much as possible; every routing protocol for sensor networks is designed around minimum energy consumption. In this paper, the LEACH protocol is modified with various shortest path algorithms to find the best-performing configuration for the sensor network. Simulation results show that the Dijkstra algorithm performs better than the other algorithms.
    Keywords: LEACH; Energy efficiency; Bellman-ford algorithm; Dijkstra algorithm; BFS algorithm.
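As a reference point for the shortest-path variants compared above, here is Dijkstra's algorithm with a binary heap over a toy sink-rooted topology; treating edge weights as per-hop transmission energy (an assumption for illustration, not stated in the abstract) makes the result a minimum-energy route cost table.

```python
import heapq

def dijkstra(graph, src):
    """Dijkstra's shortest paths from src; graph = {node: {neighbour: weight}}.
    With weights read as per-hop transmission energy, dist holds the
    minimum-energy route cost from src to every reachable node."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue  # stale entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if v not in dist or nd < dist[v]:
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

Running it from the sink gives each sensor its cheapest route cost; Bellman-Ford and BFS would return the same table here but at different asymptotic cost, which is what the comparison in the paper measures.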

  • Optimum Generation and VAr Scheduling on a Multi-Objective Framework using Exchange Market Algorithm   Order a copy of this article
    by Abhishek Rajan, T. Malakar 
    Abstract: This paper presents an application of the Exchange Market Algorithm (EMA) to solving multi-objective optimization problems in power systems. This optimization algorithm is based on the activities of shareholders seeking to maximize their profit in the exchange market. The uniqueness of this algorithm lies in the fact that it enjoys double exploitation and exploration properties, unlike several other algorithms. In order to investigate its search capability, the EMA is used to solve active- and reactive-power related objectives simultaneously in the presence of several non-linear constraints. Both optimum generation and VAr planning problems are formulated as conventional Optimal Power Flow (OPF) problems. Fuel cost (an active-power related objective), transmission line loss and total voltage deviation (reactive-power related objectives) are taken as the objective functions. The multi-objective optimization is performed through a weighted sum approach, and both fuzzy and equal weight approaches are used to declare the compromise solution. Programs are developed in MATLAB and simulations are performed on the standard IEEE-30 and IEEE-57 bus systems. The search capability of EMA in solving the multi-objective power system problems is compared with PSO based solutions.
    Keywords: Optimal Power Flow; Exchange Market algorithm; multi-objective optimization; Pareto front; fuzzy decision making.

  • A Novel Three-Tier Model with Group Based CAC for Effective Load Balancing in Heterogeneous Wireless Networks   Order a copy of this article
    by Kalpana S, Chandramathi S, Shriram KV 
    Abstract: Seamless and ubiquitous connections are the ultimate objectives of 4G technologies. But due to randomised mobility and different service classes of applications, the connection failure rate increases, which can be overcome through handover (HO). With the increased demand for handovers, the number of networks scanned for decision making and the number of negotiations for connectivity become too large. To improve efficiency, a three tier model is proposed, where requests of similar type are grouped and a common negotiation is made to reduce the number of communication messages. Only qualified networks among all the reachable access points are chosen for the decision, and handover need estimation is performed to reduce unwanted handovers. Finally, adaptive resource management is made possible through a group based call admission control (GB-CAC) algorithm that harmonises up to 50 percent of the resource utilisation, ensuring a higher number of connections with negligible call blocking and dropping.
    Keywords: Point of Attachment; handover; candidate networks; elimination factor; queues; Quality of Service; Smart Terminal.

  • Knowledge based Semantic Discretization using Data Mining Techniques   Order a copy of this article
    by Jatinderkumar R. Saini, Omprakash Chandrakar 
    Abstract: Discretization is an important and sometimes essential pre-processing step for data mining. Certain data mining techniques, such as Bayesian networks, induction rules or association rule mining, can be applied only to discretized nominal data, and various studies show significant improvement for certain techniques when applied to discretized rather than continuous data. Several discretization methods based on statistical techniques have been reported in the literature. Such statistical techniques are inadequate for capturing and exploiting the underlying knowledge inherent in the data and the context of study; big data with high dimension, and the unavailability of any a priori knowledge of the study context, make the situation worse. To overcome this limitation, we propose a novel knowledge based semantic discretization method using data mining techniques, in which discretization is based on semantic data. Semantic data is the domain knowledge inherent in the data itself and in the context of the study. Unlike semantic data mining, no explicit ontology is associated with the data in semantic discretization; it is therefore a challenging task to identify, capture, interpret and exploit the semantic data. This study presents the novel concept of semantic discretization and demonstrates the application of data mining techniques in extracting semantic data, which is further used in knowledge based semantic discretization. We show the effectiveness of the proposed methodology by applying it to the Pima Indian Diabetes dataset, a standard dataset taken from the UCI Machine Learning repository.
    Keywords: Association rule mining; Data mining; Discretization; Machine learning; Pima Indian Diabetes Dataset; Prediction Model; Semantic Discretization; Type-2 Diabetes.
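As a minimal illustration of the contrast the abstract draws between statistical and knowledge-based discretization, the sketch below maps continuous glucose readings to semantic categories using clinical cut-offs, next to plain equal-width binning. The cut-off values and function names are illustrative assumptions, not taken from the paper.

```python
# Knowledge-based discretization: bins follow domain semantics
# (illustrative clinical glucose thresholds, not the paper's values).

def discretize_glucose(value):
    """Map a continuous glucose reading to a semantic category."""
    if value < 140:
        return "normal"
    elif value < 200:
        return "prediabetic"
    return "diabetic"

# A purely statistical alternative: equal-width binning splits the observed
# range into k equal intervals regardless of clinical meaning.
def equal_width_bins(values, k):
    lo, hi = min(values), max(values)
    width = (hi - lo) / k
    return [min(int((v - lo) // width), k - 1) for v in values]

readings = [95, 150, 210, 130]
print([discretize_glucose(v) for v in readings])  # semantic labels
print(equal_width_bins(readings, 3))              # anonymous bin indices
```

The semantic labels carry domain knowledge that the numeric bin indices of the statistical method lack.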

  • OMCM-CAS: Organizational Model and Coordination Mechanism for Self-adaptation and Self-organization in Collective Adaptive Systems   Order a copy of this article
    by Ali Farahani, Eslam Nazemi 
    Abstract: The complexity of information systems has grown in past decades, and dealing with this complexity has become a hot research field in computer science. One of the solutions for dealing with system complexity and environmental changes is self-management, announced under the term autonomic computing by IBM in 2001. In recent years, the use of self-managing approaches in distributed systems without central control has been trending. Self-organization is known for its usage in distributed systems, whereas self-adaptation is mostly used in centralized systems. To bring these two concepts together, self-adaptive concepts are combined with self-organization, an interdisciplinary term with applications in several fields. Different usages and definitions have been provided for this term and for its relation to self-adaptive systems, and these differences have led to ambiguity in this domain. A Collective Adaptive System (CAS) is a large-scale distributed system with heterogeneous agents of different capabilities; this research field covers a large majority of distributed systems. Having self-adaptiveness in CAS can address problems of coordination and cooperation among agents. This research compares self-organization with self-adaptation in a broader view and identifies their differences and correlations. It also considers the applicability of coordination, reflection and architectural approaches in both domains and presents a hybrid approach. Organizational models for self-organization in distributed environments have been studied and analyzed, and a new combined organizational model has been introduced based on the strengths and weak points of current organizational models. Based on the presented organizational model, a coordination mechanism has been introduced to facilitate cooperation in CAS. A case study (the NASA ANTS mission) has been discussed and simulated, and the simulation results support the applicability and effectiveness of the presented organizational model and coordination mechanism.
    Keywords: Self-organization; Self-adaptation; Intelligent distributed system; Decentralized control; Coordination Mechanism.

  • Intricacies in Image steganography and Innovative Directions   Order a copy of this article
    by Krishna Veni, Sudhakar P 
    Abstract: With the advancement in digital communication, and with data sets growing huge due to the computerization of data gathering worldwide, the need for data security in transmission also increases. Cryptography and steganography are well known methods available to provide security: the former uses techniques that transform information in order to cipher it, while the latter concentrates on concealing the data's presence. Steganography is the practice of masking data, especially multimedia data, within other data. Visual content receives more attention from people than audio content, and a visual content file is much larger than an audio file, which helps increase the robustness of hiding algorithms. In this paper, we consider three domains in which image steganography algorithms are proposed, along with experimental results on the USC-SIPI image database which show their improvement over traditional algorithms. We propose a rule based LSB substitution method in the spatial domain, XOR based hiding in the frequency domain and data encryption standard based embedding in the wavelet domain. We find that the proposed algorithms have a better PSNR value, averaging close to 53 after embedding the secret data, while the existing algorithms have values of around 50.
    Keywords: Peak Signal to Noise Ratio; Quantization; Discrete Cosine Transformation; Wavelet; Steganalysis; cipher text.
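As a baseline illustration of the spatial-domain idea above, the sketch below shows plain single-bit LSB embedding together with the PSNR measure used for evaluation. The paper's rule-based substitution is more elaborate; this is only the textbook baseline, and the pixel data is an illustrative assumption.

```python
import math

def embed_lsb(pixels, bits):
    """Replace the least significant bit of each pixel with a message bit."""
    stego = list(pixels)
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & ~1) | b
    return stego

def psnr(cover, stego, peak=255):
    """Peak signal-to-noise ratio between cover and stego images (in dB)."""
    mse = sum((c - s) ** 2 for c, s in zip(cover, stego)) / len(cover)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(peak ** 2 / mse)

cover = [120, 121, 122, 123]          # tiny illustrative "image"
stego = embed_lsb(cover, [1, 0, 1, 0])
extracted = [p & 1 for p in stego]    # recovery is just reading the LSBs
print(psnr(cover, stego))             # high PSNR: the change is imperceptible
```

Higher PSNR after embedding means less visible distortion, which is the comparison criterion the abstract reports.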

  • Fuzzy Soft Set Approach for Classifying Malignant and Benign Breast Tumors   Order a copy of this article
    by Sreedevi Saraswathy Amma, Elizabeth Sherly 
    Abstract: Breast cancer is one of the most common health problems faced by women all over the world, and mammography is an effective technique used for its early detection. This work concentrates on developing machine learning algorithms combined with a mathematical model for classifying malignant or benign images in digital mammograms. The mathematical concept of fuzzy soft set theory is advocated here, which is an extension of crisp and fuzzy sets with parameterization. Even though fuzzy and other soft computing techniques have made great progress in solving complex systems that involve uncertainty, imprecision and vagueness, the theory of soft sets opens up a new way of managing uncertain data with parameterization. The classification is performed by using a fuzzy soft aggregation operator to identify an abnormality in a mammogram image as malignant or benign. This work is a fully automated computer aided detection method which involves automated noise removal, pectoral muscle removal, segmentation of the ROI, identification of micro-calcification clusters, feature extraction and feature selection, followed by classification. The experiment performed on images from the MIAS dataset resulted in 95.12% accuracy.
    Keywords: Digital Mammography; computer-aided diagnosis (CAD); fuzzy soft set theory; fuzzy c-means; NL-means; fuzzy soft aggregation operator.

  • The performance comparison of improved continuous mixed P-norm and other adaptive algorithms in sparse system identification   Order a copy of this article
    by Afsaneh Akhbari, Aboozar Ghaffari 
    Abstract: One of the essential applications of adaptive filters is sparse system identification, in which the performance of classic adaptive filters is not acceptable. Several algorithms are designed especially for sparse systems; we call them sparsity-aware algorithms. In this paper we study the performance of two newly presented adaptive algorithms in which a P-norm constraint is considered in defining the cost function. The general name of these algorithms is continuous mixed P-norm (CMPN). The performance of these algorithms is considered for the first time in sparse system identification. The performance of the l_0-norm LMS algorithm is also analyzed and compared with the proposed algorithms. The performance analyses are carried out with the steady-state and transient mean square deviation (MSD) criteria of adaptive algorithms. We hope that this work will inspire researchers to look for other advanced algorithms for sparse systems.
    Keywords: Adaptive algorithms; sparse; mixed P-norm; system identification.
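For readers unfamiliar with the baseline that sparsity-aware variants such as l_0-norm LMS and CMPN improve upon, the following sketch identifies a sparse system with the classical LMS update and reports the squared deviation (the MSD criterion mentioned above). The step size, tap values and noiseless setup are illustrative assumptions, not the paper's experimental settings.

```python
import random

def lms_identify(taps, n_samples=5000, mu=0.01, seed=0):
    """Identify an unknown FIR system with the classical LMS update."""
    rng = random.Random(seed)
    w = [0.0] * len(taps)                 # adaptive filter estimate
    x = [0.0] * len(taps)                 # input regressor (delay line)
    for _ in range(n_samples):
        x = [rng.gauss(0, 1)] + x[:-1]    # shift in a new white-noise sample
        d = sum(h * xi for h, xi in zip(taps, x))       # unknown system output
        y = sum(wi * xi for wi, xi in zip(w, x))        # filter output
        e = d - y
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]  # LMS update
    return w

# A sparse system: most taps are zero.
sparse_system = [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, -0.5, 0.0]
w_hat = lms_identify(sparse_system)
msd = sum((h - wi) ** 2 for h, wi in zip(sparse_system, w_hat))
print(msd)  # small: plain LMS converges, but does not exploit sparsity
```

Sparsity-aware algorithms add a norm penalty to this update so that near-zero taps are attracted to zero, which speeds convergence on systems like the one above.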

  • Using Lego EV3 to Explore Robotic Concepts in a Laboratory   Order a copy of this article
    by Jeffrey W. Tweedale 
    Abstract: During a recent Massive Open On-line Course (MOOC) at the Queensland University of Technology (QUT) titled an Introduction to Robotics, a young student used the forum to ask about the skills required to gain employment. The resounding response was the need for multiple disciplines, typically including mechatronics, software, mechanical and electrical/electronics engineering. Similarly, the curriculum focused on professional systems and the scientific rigour involved in their evolution. This limits the growing community of enthusiasts and keen observers seeking greater involvement, as they are often constrained by the lack of Science Technology Engineering and Maths (STEM) skill sets. For these reasons, a means of accelerating the learning of key concepts is required, as well as a mechanism for providing cheap and reliable access to the tools and techniques required to participate. Although LEGO Mindstorms is considered a toy that has traditionally been targeted toward the 8-14 year age group, it does cater for enthusiasts and is increasingly being used to support STEM initiatives. Because of its low cost and availability, Mindstorms was recently used as the focal solution in the MOOC to enable every student to demonstrate robotic concepts independent of the pre-requisite skills. This raises a new question about how LEGO can be used to explore robotic concepts in a laboratory. The course shows it can be used for sensor development, and it was successfully used to enhance conceptual learning for the uninitiated (enthusiast, interested observer, undergraduate, post-graduate and even those being integrated within the domain).
    Keywords: Cartesian Coordinates; Forward Kinematics; Inverse Kinematics; Lego; Mindstorms; Robotics.

  • Detection of Melanoma Skin Disease by Extracting High Level Features for Skin Lesions   Order a copy of this article
    Abstract: Melanoma is a very dangerous type of skin cancer compared to others. It can be cured when diagnosed in its early stage. The detection and diagnosis of skin cancer is difficult using earlier conventional methods, but accurate detection and diagnosis of melanoma is possible using suitable image processing techniques. High level features measure the asymmetry of skin lesion images; these features can be used to diagnose lesions as skin cancer (melanoma). This paper presents a large set of low level features for analyzing skin lesions. The best classification is obtained by combining the low level feature set with the high level feature set. The results show that this method can be used and further developed as a tool for the detection and classification of skin cancer (melanoma).
    Keywords: Feature extraction; Feature descriptor; Melanoma; Skin lesion; Radial search.

  • Applying Genetic Algorithm to Optimize the Software Testing Efficiency with Euclidean Distance   Order a copy of this article
    by Rijwan Khan 
    Abstract: Software testing ensures that developed software is error free and reliable for customer use. For verification and validation of software products, testing is applied to these products across the software industry, so all types of testing are applied before the delivery of the software to the customer. In this paper, test cases are generated automatically with the help of a genetic algorithm for data flow testing, and these test cases are divided into groups using Euclidean distance. The elements of each group are applied to the data flow graph of the program/software and all du-paths covered by the given test suites are found. New test suites are then generated with the help of the genetic algorithm to cover all du-paths.
    Keywords: Software Testing; Automatic test cases; Data flow testing; Genetic Algorithm.
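The grouping step described above can be sketched as a simple threshold-based clustering of test-case input vectors by Euclidean distance. The threshold, the seed-based grouping rule and the sample data are illustrative assumptions, not taken from the paper.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two test-case input vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def group_test_cases(cases, threshold=5.0):
    """Assign each test case to the first group whose seed is within
    `threshold`; otherwise start a new group."""
    groups = []
    for case in cases:
        for g in groups:
            if euclidean(case, g[0]) <= threshold:  # compare to group seed
                g.append(case)
                break
        else:
            groups.append([case])  # no nearby group: start a new one
    return groups

# Test cases as (input1, input2) pairs; nearby inputs land in one group,
# so one representative per group can exercise the same du-paths.
cases = [(1, 2), (2, 3), (10, 10), (11, 9), (1, 1)]
print(group_test_cases(cases))
```

Grouping similar test cases this way lets the genetic algorithm cover du-paths with fewer redundant executions.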

  • How can Reasoning improve ontology based Context-Aware system?   Order a copy of this article
    by Hatim Guermah, Tarik Fissaa, Bassma Guermah, Hatim Hafiddi, Mahmoud Nassar, Abdelaziz Kriouile 
    Abstract: Over the past two decades, the large evolution of software engineering, telecommunication and pervasive devices has led to the emergence of a new vision of development aiming at building systems that meet more complex and personalized needs, known as context-aware systems. This type of system is becoming the next computing paradigm, in which infrastructure and services are sensitive to any change of context and thus play a crucial role in providing interactive intelligent environments. In parallel, a contextual situation refers to a higher level of information inferred from the different context data flows that can be extracted from physical and virtual sensors. The power of using situations lies in their ability to provide a simple and comprehensible representation of context properties, which shields the services that manipulate them from the complexity of sensor readings, data transmission errors and inferencing activities. In this work, we aim to explore the added value of using ontology-based reasoning, focusing on first-order logic and fuzzy logic, to produce contextual situations.
    Keywords: Context; Context-Aware; Situation; Semantic Web; Ontologies; Context modeling; First Order Reasoning; Fuzzy logic Reasoning; inference and Reasoning.

  • Fractional Inverse Full State Hybrid Projective Synchronization   Order a copy of this article
    by Adel Ouannas, Ahmad Taher Azar, Toufik Ziar 
    Abstract: Referring to fractional-order systems, this paper investigates the inverse full state hybrid projective synchronization (IFSHPS) of non-identical systems characterized by different dimensions and different orders. By taking a master system of dimension $n$ and a slave system of dimension $m$, the method enables each master system state to be synchronized with a linear combination of slave system states, where the scaling factors of the linear combination can be any arbitrary real constants. Based on the fractional Lyapunov approach and the stability theory of linear fractional-order systems, the method enables commensurate and incommensurate fractional-order systems with different dimensions to be synchronized. Two different numerical examples are reported. The examples clearly highlight the capability of the conceived approach in effectively achieving synchronized dynamics for any scaling constants.
    Keywords: Full state hybrid projective synchronization; Fractional chaos; Incommensurate and commensurate systems; Fractional Lyapunov approach.

  • Dominion Algorithm- A novel metaheuristic optimization method   Order a copy of this article
    by Bushra Alhijawi 
    Abstract: In this paper, a novel bio-inspired and nature-inspired algorithm, namely the Dominion Algorithm, is proposed for solving optimization tasks. The fundamental concepts and ideas which underlie the proposed algorithm are inspired by nature and based on the observation of the social structure and collective behavior of wolf packs in the real world. Several experiments were performed to evaluate the proposed algorithm and examine the correlation between its main parameters.
    Keywords: Dominion Algorithm; Metaheuristic methods; Biologically-inspired algorithm; Artificial intelligence.

  • Fitness Inheritance in Multi-objective Genetic Algorithms: A Case Study on Fuzzy Classification Rule Mining.   Order a copy of this article
    by Harihar Kalia, Satchidananda Dehuri, Ashish Ghosh 
    Abstract: In this paper, the trade-off between accuracy and interpretability in fuzzy rule-based classifiers has been examined through the incorporation of fitness inheritance in multi-objective genetic algorithms. The aim of this mechanism is to reduce the number of fitness evaluations by estimating the fitness value of offspring individuals from the fitness values of their parents. The multi-objective genetic algorithm with this efficiency enhancement technique is a hybrid version of the Michigan and Pittsburgh approaches. Each fuzzy rule is represented by its antecedent fuzzy sets as an integer string of fixed length. Each fuzzy rule-based classifier, which is a set of fuzzy rules, is represented as a concatenated integer string of variable length. Our algorithm simultaneously maximizes the accuracy of rule sets and minimizes their complexity (i.e., maximizes interpretability). As a result of adopting fitness inheritance, it minimizes the total fitness computation time (i.e., the overall time to generate the rule set). Accuracy is measured by the number of correctly classified training samples, while rule complexity is measured by the number of fuzzy rules and/or the total number of antecedent conditions of the fuzzy rules. The efficiency enhancement technique, fitness inheritance, is used to minimize the overall computation time of generating the rule set. We examine our method through computational experiments on some benchmark datasets. The experimental outcome confirms that the proposed method reduces the computational cost without decreasing the quality of the results in a significant way.
    Keywords: Classification; fuzzy classification; multi-objective genetic algorithm; fitness inheritance; accuracy; interpretability.

  • Geometric Based Histograms for Shape Representation and Retrieval   Order a copy of this article
    by Nacera Laiche, Slimane Larabi 
    Abstract: In this paper, we present a new approach for shape representation and retrieval based on histograms. The proposed histogram descriptor builds on the concept of curve points; this integration is quite different from existing histogram-based approaches since the geometric description is stored in the histograms. The proposed description is not only effective and invariant to geometric transformations and deformations, but is also insensitive to articulations and occluded shapes, as it has the advantage of exploiting the geometric information of points. The generated histograms are then used to establish matching between shapes by comparing their histograms using dynamic programming. Experimental results of shape retrieval on different kinds of shape databases show the efficiency of the proposed approach compared with existing shape matching algorithms in the literature.
    Keywords: Log-polar histogram; Least squares curve; High curvature points; Shape description; Shortest augmenting path algorithm; Shape retrieval.

  • Improved Biogeography-based Optimization   Order a copy of this article
    by Raju Pal, Mukesh Saraswat 
    Abstract: Biogeography-based optimization (BBO) is one of the popular evolutionary algorithms, inspired by the theory of island biogeography. It has been successfully applied to various real world optimization problems such as image segmentation, data clustering, combinatorial problems, and many more. BBO finds the optimal solution by using its two main operators, namely migration and mutation. However, it sometimes gets trapped in a local optimum and converges slowly due to the poor population diversity generated by the mutation operator. Moreover, the single-feature migration property of BBO gives poor performance on non-separable functions. Therefore, this paper introduces a new variant of BBO, known as improved BBO (IBBO), obtained by enhancing the migration and mutation operators. The proposed variant successfully improves the population diversity and convergence behavior of BBO and yields better solutions for non-separable functions. The performance of the proposed variant has also been compared with and analyzed against other existing algorithms over 20 benchmark functions.
    Keywords: Evolutionary algorithm; Biogeography-based optimization; Migration operator; Mutation operator.
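As background for the improved operators, the sketch below implements the standard BBO migration operator with linear immigration/emigration rates (better habitats immigrate less and emigrate more); the paper's IBBO modifies this baseline. The tournament-style donor selection and all parameter choices are illustrative assumptions.

```python
import random

def migrate(population, fitness, seed=0):
    """Standard BBO migration: poor habitats copy single features from
    good habitats, with rates linear in fitness rank."""
    rng = random.Random(seed)
    n = len(population)
    order = sorted(range(n), key=lambda i: fitness[i], reverse=True)
    rank = {i: r for r, i in enumerate(order)}         # 0 = best habitat
    new_pop = []
    for i, habitat in enumerate(population):
        lam = rank[i] / (n - 1)                        # immigration rate
        new_habitat = list(habitat)
        for d in range(len(habitat)):
            if rng.random() < lam:
                # pick an emigrating habitat, biased toward good solutions
                j = min(rng.randrange(n), rng.randrange(n),
                        key=lambda k: rank[k])
                new_habitat[d] = population[j][d]      # single-feature copy
        new_pop.append(new_habitat)
    return new_pop

pop = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
fit = [0.9, 0.5, 0.1]   # first habitat is best: it never immigrates
print(migrate(pop, fit))
```

The single-feature copy in the inner loop is exactly the property the abstract identifies as weak on non-separable functions, since it breaks correlations between variables.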

  • Sequential Pattern based Activity Recognition model for Ambient Computing   Order a copy of this article
    by GITANJALI J, Muhammad Rukunuddin Ghalib 
    Abstract: In recent years, human activity recognition has gained popularity in ambient computing. Human activity recognition consists of identifying the daily activities of users by observing their actions. Action identification is a complex task given the data generated by each sensor. In this paper, sequential pattern based activity recognition is proposed for identifying sequential patterns among actions in a given dataset; a support value is used as a parameter to validate each sequence. The experimental evaluation is performed on a real-time dataset, and it is observed that the sequential pattern approach is very beneficial in reducing the execution time and increasing the classification accuracy of the classifiers.
    Keywords: Action; Activity; sensor based data; sequence patterns; classifiers.

  • Evaluation of Large Shareholder’s Monitoring or Tunneling Behavior in Companies Accepted in Tehran Stock Exchange
    by sahar Mojaver 
    Abstract: Shareholders' wealth is very important in the real world of finance and has received increasing attention in recent years. Although the purpose of each investment, and consequently the main purpose of each company, is maximizing shareholder wealth, over the past decades most companies have not paid enough attention to it. Ownership composition, particularly the ownership concentration of majority shareholders, is one of the most important factors influencing the control and management of companies. When large shareholders or internal shareholders such as managers have the capacity to control the company, they may have incentives to extract private benefits. Given the importance of the monitoring and behavior of controlling shareholders, this study investigates large shareholders' monitoring or tunneling behavior in companies accepted in the Tehran Stock Exchange. To do so, 125 companies over the period of 2010 to 2011 (a total of 750 company-years) are analyzed using a systematic elimination sampling method. Results show that there is a significant relationship between large shareholders' tunneling behavior and financial performance (return on equity and Tobin's Q indexes) in companies accepted in the Tehran Stock Exchange, and this relationship is U-shaped.
    Keywords: Tunneling Behavior; Large Shareholders; Companies Accepted in Tehran Stock Exchange.

  • A practical approach to Energy Consumption in Wireless Sensor Networks
    by Sonam Khera, Neelam Turk, Navdeep Kaur 
    Abstract: A Wireless Sensor Network (WSN) is a network formed by a large number of spatially distributed, wirelessly communicating sensor nodes deployed for remote environment monitoring. These networks are deployed to perform various sensing operations, such as measuring temperature, pressure, vibration and humidity, in environments where human intervention is not possible. Once deployed, the WSN starts performing its functions and consumes energy from the limited power source installed in the sensor nodes. Due to the inaccessibility of sensor nodes, these power sources are non-replaceable once the nodes are deployed in the physical environment. Therefore, the energy consumption of sensor nodes plays a significant role in determining the life of a WSN. Various studies have used available simulation environments to increase the lifetime of the network by reducing energy consumption. In our previous studies we have observed that a controlled software environment is created with the help of the various modelling tools and simulators available, such as MATLAB, NS2 and OMNET++. Though simulation and modelling in a software environment is convenient in terms of scalability and for simulating various scenarios, it lacks exposure to the real-time issues faced during actual deployment. We have written this paper based on our experience of creating a physical WSN test bed to obtain first-hand information about real-time deployment. The test bed has been designed to aid understanding of the practical aspects of energy consumption in sensor networks; it monitors the temperature at different locations in a building. In this paper we also cover different scenarios to analyse the energy consumption in our WSN test bed.
    Keywords: WSN; wireless sensor network; sensor; energy efficiency; power consumption; sleep mode; testbed

  • Local Patterns for Offline Arabic Handwritten Recognition
    by Yasser Qawasmeh, Sari Awwad, Ahmed Otoom, Feras Hanandeh, Emad Abdallah 
    Abstract: Off-line recognition of Arabic handwritten text is a challenging problem due to the cursive nature of the language and high inter- and intra-writer variability. The majority of existing approaches are based on structural and statistical features and are constrained to a specific task with a vast amount of pre-processing steps. In this paper, we explore the performance of local features for unconstrained offline Arabic text recognition with no prior assumptions or pre-processing steps. Our approach is based on local SIFT features. To capture important information and remove any redundancy, we apply a Fisher encoding algorithm and a dimensionality reduction approach, Principal Component Analysis (PCA). The resulting features are combined with a contemporary Support Vector Machine (SVM) classifier and tested on a dataset of 12 different classes. There have been great improvements in recall and precision values in comparison with those of SIFT features alone or of SIFT features with other encoding algorithms, with more than 35% improvement when tested with 5-fold cross-validation.
    Keywords: Local Features; Offline Recognition; Arabic Handwriting; Fisher Encoding.

  • A Supervised Learning Approach for Link Prediction in Complex Social Networks
    by Upasana Sharma 
    Abstract: The use of internet based social media for establishing links with family, friends and customers has become very popular. In the current scenario, social networking platforms such as Facebook, Twitter and LinkedIn are used for social and business purposes, and new links are created every fraction of a second. Predicting future links is a major challenge in the link prediction domain. Various techniques based on similarity, maximum likelihood estimation and machine learning have been proposed in the past. The focus of this work is on a supervised machine learning approach for link prediction in complex social networks. In the past, many researchers have worked on the supervised approach using only unweighted networks. Our aim is to assign a weight to each connection in the network; the weight represents the strength of the connection and improves the accuracy of the link predictor. This paper introduces a new approach using the closed triangle concept to recommend future links in social networks. Extensive experiments have been performed on a real YouTube data set, and the proposed technique performs well.
    Keywords: Link Prediction; Social Networks; Artificial Neural Network; Supervised Learning Approach; Learning Algorithms
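The closed-triangle idea can be illustrated by scoring a candidate link through its weighted common neighbours, i.e. the nodes that would close triangles with it; the weights stand for connection strengths as the abstract proposes. The graph, weights and scoring formula below are illustrative assumptions, not the YouTube data or the paper's exact feature set.

```python
def weighted_common_neighbours(adj, u, v):
    """Score a candidate link (u, v) by summing the edge weights through
    shared neighbours -- each shared neighbour would close a triangle."""
    shared = set(adj.get(u, {})) & set(adj.get(v, {}))
    return sum(adj[u][w] + adj[v][w] for w in shared)

# Weighted adjacency: adj[node][neighbour] = connection strength.
adj = {
    "a": {"b": 2.0, "c": 1.0},
    "b": {"a": 2.0, "c": 3.0},
    "c": {"a": 1.0, "b": 3.0, "d": 1.0},
    "d": {"c": 1.0},
}

# (a, b) share neighbour c, so a new a-b tie closes a strong triangle.
print(weighted_common_neighbours(adj, "a", "b"))
```

Such scores, computed for node pairs not yet connected, can serve as input features for a supervised classifier that labels pairs as future-link or not.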

  • Trust Based Quality Awareness Using Combinatorial Auction Web Service Selection In Service Based Systems
    by Suvarna Pawar, Prasanth Yalla 
    Abstract: The service-oriented paradigm offers support for engineering service-based systems (SBSs) through service composition, where existing services are composed to create new services. The selection of services with the aim of fulfilling the quality constraints becomes critical and challenging to the success of SBSs, especially when the quality constraints are stringent. However, none of the existing approaches for quality-aware service composition has sufficiently considered the following two critical issues to increase the success rate of finding a solution: 1) the complementarities between services; and 2) the competition among service providers. This paper proposes a novel approach called combinatorial auction for service selection (CASS) to support effective and efficient service selection for SBSs based on combinatorial auction. In CASS, service providers can bid for combinations of services and apply discounts or premiums to their offers for the multi-dimensional quality of the services. Based on received bids, CASS attempts to find a solution that achieves the SBS owner's optimization goal while fulfilling all quality constraints for the SBS. When a solution cannot be found based on current bids, the auction iterates so that service providers can improve their bids to increase their chances of winning.
    Keywords: Combinatorial auction; Quality of service; Service composition; Service selection; Trust.

  • Computational Modelling of Cerebellum Granule Neuron Temporal Responses for Auditory and Visual Stimuli
    by Arathi Rajendran, Asha Vijayan, Chaitanya Medini, Bipin Nair, Shyam Diwakar 
    Abstract: Sensorimotor signals from the cerebral cortex modulate the pattern-generating metaheuristic capabilities of the cerebellum. To better understand the functional integration of multisensory information by single granule neurons and the role of multimodal information in the cerebellum's motor guidance, we have modelled the granular layer microcircuit of the cerebellum and analysed the encoding of information during auditory and visual stimuli. A multi-compartmental granule neuron model comprising excitatory and inhibitory synapses was used, and in vivo-like behaviour was modelled with short and long bursts. Changing intrinsic parameters in the model helped to quantify the effect of spike-time dependent plasticity on the firing of granule neurons. Computer simulations indicate a coding correlation of output patterns to temporal excitatory stimuli. We observed the role of induced plasticity and of the granular layer in the sparse recoding of auditory and visual inputs, and the model predicts how plasticity mechanisms affect the average amount of information transmitted through single granule neurons during multimodal stimuli.
    Keywords: Cerebellum; Computational Neuroscience; Auditory; Visual; Plasticity; Sparse Coding.

  • Resource discovery in inter-cloud environment: A Review
    by Mekhla Sharma, Ankur Gupta, Jaiteg Singh 
    Abstract: The Inter-cloud has emerged as a logical evolution of cloud computing, extending computational scale and geographic boundaries through collaboration across individual Cloud Service Providers (CSPs). Resource discovery in this large-scale, distributed and highly heterogeneous environment remains a fundamental challenge to enabling effective cross-utilization of resources and services. This review paper examines various resource discovery approaches in the inter-cloud, outlining key challenges. Finally, the paper presents some ideas for building effective and efficient resource discovery strategies for the inter-cloud.
    Keywords: inter-cloud resource discovery; inter-cloud challenges; resource discovery challenges; resource discovery approaches.

Special Issue on: Green Mobile Computing for Energy-Efficient Next-Generation Wireless Communication

  • Non-linear Channel Tracking of a High Mobility Wireless Communication System
    by Sudheesh P, Jayakumar M 
    Abstract: Recently evolved wireless communication systems incorporate Multiple Input Multiple Output (MIMO) systems to overcome the effects of channel fading. Orthogonal Frequency Division Multiplexing (OFDM) is also used to overcome Inter-Symbol Interference (ISI) and ensure effective signal transmission. The channel parameters in wireless communication systems are generally non-linear. Channel estimation techniques for such systems include the Kalman Filter (KF), Extended Kalman Filter (EKF) and Unscented Kalman Filter (UKF); the Kalman filter is used for linear channel estimation, whereas the EKF and UKF are applicable to non-linear systems as well. The particle filter (PF) is a type of Sequential Monte Carlo (SMC) method which uses the Sequential Importance Sampling (SIS) technique to effectively track a non-linear system, and it is able to deal with non-Gaussian as well as non-linear systems. In this paper, we estimate the channel parameters of a fast time-varying MIMO-OFDM system using a particle filter. The proposed scheme considers a first order Auto-Regressive (AR) system model and a Rayleigh fading channel for mobile systems which incorporates the Doppler shift that occurs in a mobile environment. The performance of the particle filter is compared with other estimation methods such as the Kalman filter and extended Kalman filter, and the mean square error (MSE) as a function of the signal to noise ratio (SNR) is plotted to compare the performance of the particle filter with the other systems.
    Keywords: Non-linear channel estimation; MIMO-OFDM system; Kalman Filter (KF); Extended Kalman Filter (EKF); Particle filter (PF).
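As a minimal scalar illustration of the SIS particle filtering described above, the sketch below tracks a single AR(1) channel tap h[t] = a·h[t-1] + w[t] from noisy pilot observations y[t] = h[t]·x[t] + v[t]; the paper applies the same idea to full MIMO-OFDM channels. All model constants here are illustrative assumptions.

```python
import math
import random

def particle_filter(xs, ys, a=0.95, q=0.1, r=0.1, n_particles=500, seed=1):
    """Track a scalar AR(1) channel tap with sequential importance
    sampling and resampling."""
    rng = random.Random(seed)
    particles = [rng.gauss(0, 1) for _ in range(n_particles)]
    estimates = []
    for x, y in zip(xs, ys):
        # propagate each particle through the AR(1) state model
        particles = [a * p + rng.gauss(0, q) for p in particles]
        # weight by the Gaussian observation likelihood
        weights = [math.exp(-((y - p * x) ** 2) / (2 * r * r)) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # resample proportionally to weight to avoid degeneracy
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

# Simulate a slowly varying tap observed through BPSK pilots.
rng = random.Random(0)
h, xs, ys, hs = 1.0, [], [], []
for _ in range(100):
    h = 0.95 * h + rng.gauss(0, 0.1)
    x = rng.choice([-1, 1])
    xs.append(x)
    ys.append(h * x + rng.gauss(0, 0.1))
    hs.append(h)

est = particle_filter(xs, ys)
mse = sum((e - t) ** 2 for e, t in zip(est, hs)) / len(hs)
print(mse)  # small tracking error despite the noisy observations
```

The same propagate/weight/resample loop generalizes to vector-valued channels, which is where the particle filter's tolerance of non-Gaussian, non-linear models pays off over the EKF.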

  • Securing Ad Hoc Networks using Energy Efficient and Distributed Trust based Intrusion Detection System
    by Deepika Kukreja, S.K. Dhurandher, B.V.R. Reddy 
    Abstract: Mobile Ad Hoc Networks (MANETs) are subject to a broad variety of attacks. Black hole and gray hole attacks are security threats that weaken MANETs by inducing packet forwarding misbehavior. This paper proposes a method for the detection and isolation of malicious nodes and the selection of the most reliable path for routing data. An Intrusion Detection System (IDS) is utilized to catch the nodes exhibiting packet forwarding misbehavior. The monitoring scheme is appropriate for MANETs as it emphasizes energy reduction, is distributed in nature and is compliant with dynamic network topology. The proposed method is simulated using the network simulator NS2. Findings show that the proposed system is efficient in terms of Packet Delivery Ratio (PDR), routing packet overhead, end-to-end delay and energy management compared with the Dynamic Source Routing (DSR) protocol and other protocols in this area. The protocol improves the PDR by 43.44% compared with the DSR protocol in the presence of malicious nodes.
    Keywords: Ad Hoc Networks; Dynamic Source Routing Protocol; Intrusion Detection System; Trust; Gray hole attack; Energy.

  • Contribution to Radio Resource Distribution approach in Wireless Cellular Software Defined Networking
    by Fall Hachim, Ouadoudi Zytoune, Mohamed Yahyai 
    Abstract: Wireless traffic demand is currently huge relative to the limited available bandwidth. This drives the development of complex and power-hungry network technologies that are often harder to manage; thus, core network features such as Radio Resource Management (RRM) raise important issues such as scalability and energy efficiency. This paper discusses Radio Resource Distribution (RRD) algorithms for next-generation wireless cellular networks. We leverage the benefits of Software Defined Networking (SDN) by proposing AoD (Algorithms on Demand), which aggregates several schedulers at the network controller. Based on Markov prediction, a real-time context data analysis selects the most suitable RRD scheme at the evolved Node B. This choice depends on cell status (load, interference, etc.), thanks to the device programmability feature of SDN. Moreover, AoD reduces power consumption by always optimizing the transmission rate. Simulations show that AoD can approach 5G (fifth generation) radio policies, with improved Quality of Experience and a low carbon footprint as benefits.
    Keywords: Energy Efficiency; Markov Model Prediction; Openness; Radio Resource Management; Software-Defined Networking.

    by Vimal Kumar Stephen K, Mathivanan V 
    Abstract: In the context of rapid technological development, the primary objective of this research is to retain the energy level of the sensor nodes for a long period in a wireless sensor network; ensuring a negligible energy drop leads to a long network lifetime. A secure group key management technique is employed to address security problems such as authentication, confidentiality and scalability. A cluster key and a master key are used exclusively in the network to protect the sensed information during communication between nodes. Static and movable mobile sinks are deployed to enhance the lifetimes of the sensors in the network. Initially, the static mobile sinks act as a trusted third party for computing and distributing keys between the sensor nodes and the clusters. Further, movable sinks are used to receive sensed data from a sensor at its current location, which avoids the frequent overhead of choosing a new cluster head. Energy is retained because the trusted third-party sink performs all the computations of the cluster head; computation at the cluster head is reduced, thereby increasing the lifetime of the particular cluster. Experimental outcomes show that the suggested technique produces better results than related work.
    Keywords: Key Generation; Cluster key; Master key.
    DOI: 10.1504/IJAIP.2018.10006968
  • Vanet Routing Protocol with traffic aware approach   Order a copy of this article
    by Sangeetha Francis 
    Abstract: A Vehicular Ad hoc NETwork (VANET) is a type of Mobile Ad hoc NETwork (MANET) that uses vehicles as nodes. Routing is a fundamental requirement of VANET applications, so it is necessary to devise a routing protocol that copes well with rapid topology changes and disconnected network conditions. To address these specific needs of VANETs, we present a novel greedy routing protocol for vehicular networks, called VRPTA, that suits both the city environment and the highway environment. With the help of a localization system, GPS (Global Positioning System), the proposed protocol is designed to relay data efficiently in the network by considering different scenarios such as road traffic variation and various environmental characteristics. The protocol supports both vehicle-to-vehicle and vehicle-to-infrastructure communication, whichever is applicable, thereby ensuring reliable transmission. In addition, we consider information about vehicle speed, direction and density in a city traffic configuration consisting of bidirectional roads, multiple lanes and a highway scenario. The work is implemented using the NS2 simulator.
    Keywords: VANET; Routing.

  • Real Time MAF Based Multi Level Access Restriction Approach for Collaborative Environment Using Ontology   Order a copy of this article
    by Rajeswari Sampath 
    Abstract: The collaborative environment encourages rapid development in many organizations but struggles with malicious access. Many access control approaches for improving the performance of collaborative environments have been discussed earlier, but unfortunately their performance is unsatisfactory. This paper presents a novel real-time malicious access frequency (MAF) based multi-level restriction scheme. The method maintains an ontology of resources, which contains data of various kinds, their properties and the set of roles of the environment that may access them. The system also maintains logs of the previous accesses of the environment's various users; the log supports computing the MAF for the requested data and user. Using the computed MAF value, the method computes the multi-attribute trust measure for each level as well as the multi-level trust weight, and based on the computed values it enforces access restriction to improve the quality of collaborative development.
    Keywords: Collaborative Environment; MAF; Data Ontology; Access Restriction; Public Auditing; MLA.

  • Reconfigurable Communication Wrapper for QOS Demand for Network On Chip   Order a copy of this article
    by S. Beulah Hemalatha, Vigneswaran T 
    Abstract: Efficient communication wrapper design is an important research issue in network on chip. A single wrapper with fixed design parameters will not be efficient in a heterogeneous network-on-chip scenario: the system on chip has many different computing and communication blocks with different data rates and data formats. To interconnect such heterogeneous blocks, standards-based wrapper frameworks such as the OCI wrapper have been proposed, but such standard wrappers do not support the QoS demands of every block. This work therefore proposes a framework for reconfigurable communication wrapper design with QoS support. The proposed framework is simulated in LabVIEW software and tested on National Instruments FlexRIO 7845R FPGA hardware. The results show that on-the-fly reconfigurability is achievable with the framework.
    Keywords: OCI wrapper; reconfigurable communication wrapper; QOS; system on chip.

  • MMSI: A Multi-Mode Service Invocation Algorithm To Improve The Connectivity In Accessing Cloud Services In Heterogeneous Mobile Cloud   Order a copy of this article
    by R.K. Nadesh, M. Aramudhan 
    Abstract: Modern research in the cloud environment focuses on enabling mobile users to access data through cloud services via an arrangement of regional cloudlets when connectivity with the cloud service provider is weak or lost. As cloud services can be activated anytime from anywhere, connection management should be handled fairly to maintain the service requirements. Although cloud services can be invoked independently of location, if the service parameters do not meet the constraints, the performance of the cloud system degrades. In this paper, we propose a multi-mode service invocation algorithm for improving cloud service to mobile users. When a mobile user is connected to a cloud service and the service level drops under random mobility, the algorithm chooses a cloudlet or an ad hoc cloud to provide identical service without interruption. In our experiment, we estimate parameters such as delay, signal strength and energy; when the estimated values fall below the threshold, we invoke and bind with the nearest cloudlet or ad hoc cloud, whichever is available. The client invokes the services through cellular networks under normal conditions and, at every time interval, computes the signal strength, energy level and delay factors in accessing the cloud service. When the estimated parameters are below the threshold, it connects with the local access point. The multi-mode algorithm computes the service invocation weight and selects the connectivity mode for continuing the service invocation. We show that this algorithm improves user performance in accessing cloud services in terms of throughput, connectivity ratio and service completion.
    Keywords: Cloud Computing; Mobile Adhoc Clouds; Cloudlets; Service Invocation.
    DOI: 10.1504/IJAIP.2018.10005659
  • Malicious node detection through Run Time Self healing algorithm in WSN   Order a copy of this article
    by B.R. Tapasbapu, L.C. Siddanna Gowd 
    Abstract: A wireless sensor network possesses a large number of randomly deployed nodes, which configure themselves to form a network. A WSN's major role is to monitor the environment, collect data and communicate the data to the base node. The originality of the data communicated by the WSN nodes is an important criterion for avoiding failure in the network, so self-healing techniques are implemented to overcome losses of data in routing due to misbehaving nodes. However, most protocols designed for self-healing are not energy-constrained and are not suitable for battery-powered networks. We propose a new run-time self-healing algorithm which uses individual monitoring nodes that scan the data and assess the stability of the nodes to ensure proper communication in the network. The proposed method was compared with the self-healing hybrid sensor network architecture (SASHA) and an Error Correction Code (ECC) algorithm to demonstrate the improvement in the efficiency of the network.
    Keywords: Wireless sensor network; Fault occurrence; Self-healing; Node management; Dead node avoidance.

  • Classification of Neonatal Epileptic Seizures using Support Vector Machine   Order a copy of this article
    by Vimala Velayutham 
    Abstract: Neonates are infants in their first 28 days of life. The diagnosis of neonatal seizures is supported by clinical observations and electroencephalography (EEG). Continuous monitoring of neonatal EEGs in neonatal intensive care units is tedious and requires expert intervention, and the introduction of clinical decision support systems into these units has proved to aid neonatal staff. Neonatal seizures of epileptic origin are the more common kind, and we propose an approach to aid in their classification using the EEG signals of neonates. The Daubechies wavelet transform is used to separate the frequency bands and extract features; the theta rhythm of the EEG reliably reflects the occurrence of epileptic seizures in neonates. The features taken into consideration for classification are mean, variance, skewness and kurtosis. Support Vector Machine (SVM) based classification is adopted to develop a system that detects the presence or absence of epileptic seizures. The performance of this diagnostic aid has been studied: the system has a sensitivity of 94% and a specificity of 96%, and the receiver operating characteristic curve is also used in the performance assessment.
    Keywords: Classification; EEG; neonatal intensive care units; neonatal epileptic seizures; support vector machine.
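    The four per-window features this abstract names (mean, variance, skewness, kurtosis) can be computed as below. This sketches only the feature-extraction stage; the wavelet decomposition and the SVM itself would come from standard libraries, and the sample window values are made up for illustration.

```python
import math

def stat_features(window):
    """Mean, variance, skewness and excess kurtosis of one EEG window --
    the four features the classifier is trained on."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    sd = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in window) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in window) / (n * var ** 2) - 3.0
    return mean, var, skew, kurt

# a symmetric window has zero mean and zero skewness
m, v, s, k = stat_features([-2.0, -1.0, 0.0, 1.0, 2.0])
print(m, v, round(s, 6), round(k, 6))
```

    Each EEG window thus collapses to a four-dimensional feature vector, which is what the SVM separates into seizure and non-seizure classes.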

  • A Novel approach for Secured Transmission of DICOM Images   Order a copy of this article
    by Priya Selvaraj 
    Abstract: DICOM (Digital Imaging and Communications in Medicine) communication mainly focuses on the transmission of medical images, the storage of the information in a medical image, and the printing and securing of the image. Medical image communication mainly provides secure medical facilities for physicians and patients. The medical image is compressed in the JPEG 2000 format. The hash value is computed using an Additive Hash Function (AHF) and encrypted using RSA to form the digital signature. The combination of the digital signature and text forms the watermark; this text consists of patient information, doctor information, disease information and prescription. Reversible watermarking is a technique in which the watermark is embedded and, once the watermarked image passes through the authentication process, the original image is extracted along with the watermark. Strict authentication is provided, by implementing the Kerberos technique, in order to achieve high security for accessing the secure medical images.
    Keywords: Reversible watermarking; Authentication; Medical Image Compression; JPEG2000 Compression; Additive Hash Function; RSA; Kerberos.

    by Kalyanaraman Ramkumar, Gurusamy Gunasekaran 
    Abstract: Cloud computing is a developing technology in distributed computing which provides a pay-per-use service model according to user needs and requirements. The cloud includes a collection of virtual machines offering both computational and storage facilities, and the objective of cloud computing is to provide effective processing over highly distributed resources. Cloud systems are developing fast and face many challenges, two of the main ones being the scheduling process and security. Scheduling concerns how the scheduler adapts its strategy, according to a changing set of goals, to control the order of work to be executed by a computer system. In this research paper, a scheduling algorithm of collocate First Come First Served (FCFS) of supremacy elements is proposed, in which system efficiency is improved by using FCFS in a parallel manner. To address the security problem, crisscross Advanced Encryption Standard (AES) is proposed, increasing security in the cloud through a grid arrangement. The aggregate aim of this work is to enhance system efficiency and security by using both crisscross AES and collocate FCFS of supremacy elements.
    Keywords: Cloud computing; First Come First Served; Advanced Encryption Standard; Security.
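    The FCFS half of the proposal can be illustrated with a toy scheduler that serves jobs strictly in arrival order and reports per-job waiting time. The job list is hypothetical, and the crisscross-AES security layer is omitted (a real implementation would use an existing AES library rather than hand-rolled crypto).

```python
def fcfs_schedule(jobs):
    """jobs: list of (name, arrival, burst) tuples, served strictly in
    arrival order (First Come First Served). Returns per-job waiting times."""
    waits, clock = {}, 0
    for name, arrival, burst in sorted(jobs, key=lambda j: j[1]):
        start = max(clock, arrival)   # wait until the CPU frees up
        waits[name] = start - arrival
        clock = start + burst
    return waits

w = fcfs_schedule([("t1", 0, 4), ("t2", 1, 3), ("t3", 2, 1)])
print(w)  # t1 waits 0, t2 waits 3, t3 waits 5
```

    The well-known drawback visible even here is the convoy effect: the short job t3 waits behind every earlier arrival, which is what parallel FCFS variants such as the one proposed try to mitigate.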

    by Jagarlamudi Ravisankar, B. Seetha Ramanjaneyulu 
    Abstract: A major shortcoming of Orthogonal Frequency Division Multiplexing (OFDM) is the extreme Peak-to-Average Power Ratio (PAPR) of the transmitted signals. The partial transmit sequence (PTS) method is capable of improving the PAPR statistics of OFDM signals: the data block to be transmitted is split into disjoint sub-blocks, which are then merged using phase factors chosen to minimize the PAPR. Because generic PTS needs an extensive search over every combination of permitted phase factors, search complexity rises exponentially with the number of sub-blocks. In the current work, a novel sub-optimal technique based on Binary Honey Bee Mating (BHBM-PTS) is suggested for finding a better combination of phase factors. The BHBM-PTS protocol can considerably decrease computational complexity for larger numbers of PTS sub-blocks while simultaneously providing lower PAPR. Simulations show that BHBM-PTS is an effective technique for achieving a considerable PAPR reduction.
    Keywords: Orthogonal Frequency Division Multiplexing (OFDM); Peak-to-Average Power Ratio (PAPR); Partial transmit sequence (PTS); Binary Honey Bee Mating (BHBM).
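    A minimal sketch of plain PTS (not the BHBM search itself): split the block into sub-blocks, IDFT each, and exhaustively try ±1 phase factors for the lowest-PAPR combination, which is exactly the exponential-cost search that motivates heuristics like BHBM. Block length, sub-block count and the toy symbol sequence are assumptions.

```python
import math, cmath
from itertools import product

def papr_db(x):
    """Peak-to-average power ratio of a complex time signal, in dB."""
    powers = [abs(s) ** 2 for s in x]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def pts_exhaustive(X, n_sub=4, phases=(1, -1)):
    """PTS: partition the frequency-domain block into disjoint sub-blocks,
    IDFT each, then search phase-factor combinations for the minimum-PAPR
    sum (exhaustive search; a BHBM-style heuristic would replace this loop)."""
    n, size = len(X), len(X) // n_sub
    subs = []
    for b in range(n_sub):
        Xb = [X[k] if b * size <= k < (b + 1) * size else 0 for k in range(n)]
        subs.append(idft(Xb))
    best = None
    for combo in product(phases, repeat=n_sub):
        x = [sum(c * sub[t] for c, sub in zip(combo, subs)) for t in range(n)]
        p = papr_db(x)
        if best is None or p < best[0]:
            best = (p, combo)
    return best

X = [(-1) ** k + 1j * (-1) ** (k // 2) for k in range(16)]  # toy QPSK-like block
orig = papr_db(idft(X))
best_papr, best_combo = pts_exhaustive(X)
print(round(orig, 2), "->", round(best_papr, 2))
```

    Because the all-ones phase combination reproduces the original signal, the PTS optimum can never be worse than the unmodified block; the search cost here is |phases|^n_sub evaluations, which is what grows exponentially.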

  • A Survey on Internet of Vehicles: Applications, Technologies, Challenges and Opportunities   Order a copy of this article
    by Priyan M K, Ushadevi G 
    Abstract: This work provides a survey on the Internet of Things (IoT), the Internet of Vehicles (IoV) and the Internet of Everything (IoE). The IoT provides interconnection between various physical devices such as sensor devices, mobile phones, laptops, PDAs and so on; nowadays, it also enables connections between vehicles, buildings and other items that are fitted with sensors, actuators and gateways. The IoV is derived from the IoT and is used to interconnect things, vehicles and environments in order to transfer data and information between networks. The IoE is an enhanced version of Internet-based technologies such as the Internet of Things, the Internet of Humans and the Internet of Digital, providing end-to-end connectivity among procedures, knowledge and ideas engaged across all connectivity use cases. This paper discusses various challenges and issues in modern IoT, IoV and IoE systems, along with security issues and various applications of IoT in healthcare. Though IoT devices are used in modern applications with good performance, some challenges still exist; to overcome these issues, various open research problems are identified in this paper.
    Keywords: Internet of Things; Internet of Vehicles; Internet of Everything; Vehicular ad hoc network; Big Data; Cloud Computing; Intelligent Transportation System.

  • Radio Spectrum Collision Avoidance in Cluster Cognitive Network through Gazer Nodes   Order a copy of this article
    by V. Nagaraju, L.C. Siddanna Gowd 
    Abstract: The spectrum deficiency in cognitive radio can be addressed effectively through better utilization of the radio spectrum, which today is not shared effectively among all users. Since users are spread across different locations, spectrum allocation and spectrum sharing are important for using the spectrum effectively and for allocating a communication channel to every device in the network; by doing so, all the nodes in the network can communicate while covering a large area. In cognitive radio, spectrum sensing, spectrum allocation and reuse approaches, with different algorithms, help improve the utilization of the spectrum. Traditional spectrum allocation techniques such as fuzzy logic and harmony search replace the spectrum with a new spectrum scheme, and newer techniques bring more efficiency in achieving spectrum utilization. Still, the cognitive mesh network has the problem of collision between secondary and primary users. To minimize the effect of collision, we introduce a gazer-based cognitive radio network (GCRN) which provides more freedom in the frequency-sharing paradigm. The novel algorithm enables the network to adapt automatically to every change in the environment of the cluster in the cognitive radio network.
    Keywords: Cognitive radio network; Gazer nodes; Spectrum Sensing; Resource Sharing; Control channel.

  • Intelligent Intrusion Detection Techniques for Secure Communications in Wireless Networks: A Survey   Order a copy of this article
    by Rama Prabha Krishnamoorthy Pakkirisamy, N. Jayanthi 
    Abstract: Communication is at the heart of day-to-day activity in the current world. Since the world relies on electronic devices for carrying out all daily activities, electronic and wireless communication, along with the internet, plays a major role in providing a sophisticated life. Moreover, the number of internet users has grown steadily over the recent two decades, as fast communication makes life easier; in such a scenario, the number of intruders on the internet is increasing dramatically. In this paper, we provide a survey on the use of machine learning algorithms for developing intelligent intrusion detection systems, which are most useful for providing secure communication in wireless networks. Moreover, we compare the important intelligent intrusion detection systems based on their performance and suggest some new ideas for improving the decision accuracy of current intelligent intrusion detection systems.
    Keywords: Intrusion Detection System; Machine Learning Algorithms; Pre-processing; Classification; Wireless Networks;.
    DOI: 10.1504/IJAIP.2018.10005732
  • Perlustration on existing techniques and applications in cloud computing for smart buildings using IoT   Order a copy of this article
    by D. Shiny Irene, T. Sethukarasi 
    Abstract: One of the emerging applications of IoT and its devices is the design and construction of smart devices for smart buildings. Though one design goal of smart devices, anytime-anywhere presence, has been achieved, there is a dire need to address other challenging design issues, viz. security, interoperability and energy efficiency. There are many emerging algorithms and techniques to address these issues. An attempt has been made in this paper to survey the emerging and promising algorithms that can address this ever-dynamic issue in building smart cities using IoT. Energy-efficient, environment-friendly and secure smart devices can be designed and developed in future to build smarter cities.
    Keywords: Internet of Things; Smart Buildings; Smart Energy and Security; Cloud Computing.
    DOI: 10.1504/IJAIP.2018.10006419
    by Naga Ravikiran Desiraju, Dethe C G 
    Abstract: Wireless sensor networks (WSNs) bring an innovative embedded-system model, with restrictions on computing ability, intercommunication, storage capacity and energy resources, applied to a wide range of applications in situations where constructing a network on conventional infrastructure is not feasible. Clustering in WSNs is a successful technique for reducing the energy use of sensor nodes. Fuzzy logic calculates the Cluster Head (CH) selection probability, depending on a node's earlier communication history, to choose the CH; the set of rules applied to the fuzzified input is the fuzzy rule base, and the output of the inference engine is changed to a crisp output by defuzzification. Artificial Bee Colony (ABC), an optimization protocol, owes its inspiration to the foraging behavior of honey bees; it is a comparatively new optimization algorithm which has proven to be on par with classical bio-inspired protocols. In this work, the ABC optimization algorithm is suggested for selecting fuzzy rules. Rule selection methods combine different rules from the fuzzy rule set to reduce the number of rules while maintaining the performance of the system: rules that decrease system performance are removed, to obtain a fuzzy rule set with improved performance.
    Keywords: Wireless sensor networks (WSN); Clustering; Artificial Bee Colony (ABC); fuzzy rule selection.

  • Image Encryption Techniques for Data Transmission in Networks: A Survey   Order a copy of this article
    by Jayanthi Ramasamy, John Singh K 
    Abstract: Today, the rapid growth of communication technologies such as the internet, satellite, ground communications and mobile networks has created the need to protect important information, whether individual or general, and the respective data against attackers. In this scenario, the privacy, integrity, confidentiality and authenticity of images have become significant issues for the storage and communication of images. Encryption is the best way to maintain the safety of transmitted data, by transforming the information into an unintelligible form. In the past, various encryption methods were proposed and applied to protect trustworthy images from unauthorized users. This study discusses and analyses the previous encryption methods and identifies their issues, reviews the related work for each scheme, and finally discusses future directions for image encryption techniques.
    Keywords: Image encryption; steganography; cryptography; Color image encryption; image quality measure; security analysis; Cryptoanalysis.

    by S. Thaiyalnayaki, J. Sasikala, R. Ponraj 
    Abstract: Search engines play an important mediating role between users' intentions and visual images. Digital images are easy to manipulate and modify with powerful image processing tools, but matching slightly altered copies to their originals, termed near-duplicate image detection, remains a challenging task. Web image search results nowadays contain a significant portion of near-duplicates, with images varying in size and resolution; since these refer to the same or similar images, most search engines group them in their result pages. The definition of a near-duplicate image varies depending on what resolution and geometric variations are deemed acceptable. Near-duplicate (ND) image detection has recently appeared as a timely issue, being regarded as a powerful tool in various emerging applications: copyright enforcement, news topic tracking, and image and video search are tasks enabled by the identification of near-duplicate images. In this paper, a method is proposed for indexing near-duplicate images using a segmented MinHash algorithm. First, image enhancement is performed based on the user's query image and features are extracted; SURF (Speeded-Up Robust Features) is used to extract the local invariant features of each web image. We then introduce a new algorithm, called segmented MinHash, which calculates the similarity among the feature-extracted images. Finally, near-duplicate and exact-duplicate images are indexed based on the user query, using Locality Sensitive Hashing (LSH). We demonstrate that our proposed approach is extremely effective for collections of web images.
    Keywords: Indexing; near-duplicates; near-duplicate detection; Image Enhancement.
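    Independently of the paper's segmented variant, the core MinHash idea can be sketched as follows: per-hash minima over a feature set form a signature whose position-wise agreement estimates Jaccard similarity. The integer feature ids and hash count below are illustrative assumptions.

```python
import random

def make_hashes(n_hashes=128, prime=2147483647, seed=7):
    """Random affine hash functions h(x) = (a*x + b) mod prime."""
    rng = random.Random(seed)
    return [(rng.randrange(1, prime), rng.randrange(0, prime))
            for _ in range(n_hashes)], prime

def minhash_signature(features, hashes, prime):
    """MinHash signature: for each hash, the minimum hashed feature value."""
    return [min((a * f + b) % prime for f in features) for a, b in hashes]

def estimated_jaccard(sig_a, sig_b):
    """Fraction of matching signature positions estimates Jaccard similarity."""
    return sum(x == y for x, y in zip(sig_a, sig_b)) / len(sig_a)

hashes, prime = make_hashes()
a = set(range(100))        # integer feature ids of the original image
b = set(range(10, 110))    # near-duplicate: 90 of 110 features shared
true_j = len(a & b) / len(a | b)   # = 90/110
est = estimated_jaccard(minhash_signature(a, hashes, prime),
                        minhash_signature(b, hashes, prime))
print(round(true_j, 3), round(est, 3))
```

    Signatures are short fixed-length vectors, so they can be banded into LSH buckets, which is how near-duplicate candidates are retrieved without pairwise comparison over the whole collection.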

  • Adaptive Multi loop IMC Based PID controller tuning using Bat Optimization algorithm for Two Interacting Conical Tank Process   Order a copy of this article
    by Lakshmanaprabu Sk 
    Abstract: In this paper, a multi-loop adaptive internal model control (IMC) based PID controller is designed for the two interacting conical tank level process (TICTLP). The nonlinear TICTLP is decomposed into a linear transfer function matrix around the operating points, and the effective open-loop transfer function (EOTF) is developed using a simplified decoupler. The IMC-based PID controller parameters are obtained for the EOTF model using the Bat optimization algorithm (BOA). A weighted sum of the integral time absolute error is used as the control design objective function for the multi-loop IMC-PID design, which yields a faster settling time with minimum overshoot. Fuzzy-based adaptive gain scheduling is used to provide complete control of the TICTLP, and a fuzzy-based adaptive decoupler is implemented to eliminate the dynamic interaction between the control loops. The simulation results of the proposed controller are compared with conventional ZN-PID and IMC controllers to show the superiority of the proposed controller; the responses indicate performance improvement in terms of time-domain performance indices, servo tracking, regulatory response and settling time.
    Keywords: Conical tank process; Effective open loop transfer function; adaptive decoupler; Multi loop IMC control; IMC-PID; Relative Gain Array (RGA); Fuzzy gain scheduling; Bat Optimization Algorithm.
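    As a hedged illustration of IMC-based PID tuning (not the paper's BOA-tuned multi-loop design), one commonly cited rule for a first-order-plus-dead-time model maps the model constants and the IMC filter constant directly to PID settings; the numeric values below are made up.

```python
def imc_pid_fopdt(K, tau, theta, lam):
    """One common IMC-based PID tuning rule for the FOPDT model
    G(s) = K*exp(-theta*s)/(tau*s + 1), with IMC filter constant lam.
    (Illustrative rule only; the paper tunes its loops with BOA.)"""
    kc = (tau + theta / 2.0) / (K * (lam + theta / 2.0))  # proportional gain
    ti = tau + theta / 2.0                                 # integral time
    td = tau * theta / (2.0 * tau + theta)                 # derivative time
    return kc, ti, td

kc, ti, td = imc_pid_fopdt(K=2.0, tau=10.0, theta=2.0, lam=4.0)
print(round(kc, 3), ti, round(td, 3))
```

    The single tuning knob lam trades speed against robustness, which is why heuristic optimizers like BOA are attractive for picking it against a time-domain cost such as weighted ITAE.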

    by Ramana Rao M V, Adilakshmi T 
    Abstract: Energy can be conserved efficiently in a WSN through clustering of nodes. As in all shared-medium networks, the Medium Access Control (MAC) protocol enables the smooth functioning of the network; an important function of MAC is to prevent the bottleneck of two nodes sending data simultaneously. Many MAC protocols have been developed for the smooth functioning of WSNs, including Berkeley Medium Access Control (BMAC), which utilizes low-power listening and a proper preamble for low-power communication. The main challenges of BMAC are overhearing and the power wasted in long preambles. The aim of this work is to cluster the BMAC protocol using heuristic methods based on River Formation Dynamics (RFD) and Particle Swarm Optimization (PSO). The suggested protocol's performance is evaluated for Packet Delivery Ratio (PDR), end-to-end delay, hop count and jitter. The outcome shows that the proposed River-PSO clustered BMAC performs better than BMAC with flooding and BMAC with cluster-based routing, under both static and varying node mobility.
    Keywords: Wireless Sensor Networks (WSN); Cluster Head (CH); Medium Access Control (MAC); River Formation Dynamics (RFD); Particle Swarm Optimization (PSO).

  • An Adaptive Low Power Coding Scheme for the NOC   Order a copy of this article
    by Jasmin M., Vigneswaran T. 
    Abstract: Low-power system design is important for system-on-chip design, where many subsystem blocks communicate with each other at high data rates to realize the system functionality. Low-power coding reduces energy either by reducing self-switching activity or by reducing coupling switching activity. However, a typical Network on Chip (NOC) system requires a low-power coding scheme that can handle different kinds of data traffic from different IP cores at different instants and at different places in the System on Chip (SOC); a single low-power coding scheme will not satisfy all subsystem or application demands. So in this paper a correlation-analysis-based adaptive data coding scheme is presented which provides low power at any instant on any kind of data traffic. This is done by selecting and encoding the data with different coding schemes based on the correlation level of the data traffic, which is classified into three categories: low-correlated, moderately correlated and highly correlated. Based on this classification, different coding schemes are applied. The proposed system is simulated in the LabVIEW FPGA tool for the USRP RIO target, a wireless transceiver that can inject megabits of test data per second for testing the coding schemes. The power consumption of the existing coding schemes is compared with the proposed adaptive scheme using test data sets of different correlation levels. The results show that the proposed system saves 25% energy compared with other coding schemes in the worst-case scenario.
    Keywords: NOC; SOC; Correlation analysis; USRP RIO.
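    One classic self-switching-reduction code of the kind such adaptive schemes select among is bus-invert coding: complement the next word (and raise a flag line) whenever sending it would flip more than half the bus lines. A toy sketch with an assumed 8-bit bus, ignoring the flag line's own transitions for brevity:

```python
def bus_invert_encode(words, width=8):
    """Bus-invert coding: if the next word would flip more than half of
    the bus lines, send its complement plus an invert flag instead."""
    mask = (1 << width) - 1
    prev, out, transitions = 0, [], 0
    for w in words:
        flips = bin(prev ^ w).count("1")
        if flips > width // 2:
            w, flips = w ^ mask, width - flips   # send the complement
            out.append((w, 1))                   # invert flag set
        else:
            out.append((w, 0))
        transitions += flips
        prev = w
    return out, transitions

def raw_transitions(words):
    """Bit transitions of the unencoded word stream (bus idles at 0)."""
    prev, t = 0, 0
    for w in words:
        t += bin(prev ^ w).count("1")
        prev = w
    return t

data = [0x00, 0xFF, 0x01, 0xFE, 0x0F]   # poorly correlated toy traffic
encoded, coded_t = bus_invert_encode(data)
print(raw_transitions(data), "->", coded_t)
```

    On anti-correlated traffic like this, the invert flag absorbs almost all the switching; on already well-correlated traffic the code degenerates to pass-through, which is why a correlation-adaptive selector between several codes can do better than any single one.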

  • Severity of defect: An optimized prediction   Order a copy of this article
    by Kiran Kumar Reddi, Achuta Rao S. V. 
    Abstract: To assure the quality of software, an important activity is performed, namely Software Defect Prediction (SDP). Historical databases are used to detect software defects using different machine learning techniques, which increases the potential for a positive outcome. Conversely, undetected defects have disadvantages: testing becomes expensive and quality is poor, so the product is unreliable for use. A bug report illustrates the severity of a defective code, and the resources for testing and other planning activities are allocated based on defect severity assessment. This paper classifies the severity of defects using a method based on an optimized Neural Network (NN). The method is based on the Shuffled Frog Leaping algorithm, and the experimental outputs reveal that it can outperform a Levenberg-Marquardt based NN system (LM-NN).
    Keywords: Software defect prediction (SDP); Severity; Neural Network; Levenberg Marquardt (LM); Shuffled Frog; fuzzy classifier.

  • High-level optimized systems design using hardware-software partitioning   Order a copy of this article
    by Lilia Kechiche, Lamjed Touil, Bouraoui Ouni 
    Abstract: Embedded systems have a wide range of uses and have become an essential part of today's life. A typical embedded system consists of application-specific hardware and programmable software. The hardware-software (HW/SW) partitioning problem plays a crucial role in embedded systems design, as it allows the proposition of an optimized system with predefined constraints by choosing which tasks should be mapped to software and which to hardware. In this paper, a heuristic algorithm, hybrid bee-colony optimization for multiple-choice HW/SW partitioning, is proposed with the objective of minimizing power consumption and execution time while meeting an area constraint. The heuristic is developed to generate an approximate solution within an acceptable delay. The Virtex 5 is chosen as the target platform. Simulation results are compared with existing works and show that the algorithm rapidly generates a solution close to the exact optimum.
    Keywords: hardware-software partitioning; heuristic algorithm; bee-colony optimization; SOPC.
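    The objective described above can be illustrated with a deliberately tiny baseline (not the authors' hybrid bee-colony heuristic): each task is assigned to hardware or software so as to minimize execution time subject to an area budget. All task figures below are invented; for this handful of tasks exhaustive enumeration is feasible, whereas the paper's heuristic approximates this search for realistic problem sizes.

```python
# Toy HW/SW partitioning objective: pick, per task, hardware (1) or software
# (0) to minimize total execution time under an area budget. The task
# figures are hypothetical; the paper's hybrid bee-colony heuristic would
# search this space approximately instead of exhaustively.
import itertools

# (sw_time, hw_time, hw_area) per task -- invented numbers.
tasks = [(9, 2, 5), (7, 3, 4), (4, 1, 6), (8, 2, 7)]
AREA_BUDGET = 12

def cost(assign):
    area = sum(t[2] for t, a in zip(tasks, assign) if a)
    if area > AREA_BUDGET:
        return float("inf")  # infeasible: area constraint violated
    return sum(t[1] if a else t[0] for t, a in zip(tasks, assign))

best = min(itertools.product((0, 1), repeat=len(tasks)), key=cost)
print(best, cost(best))  # -> (1, 0, 0, 1) 15: tasks 0 and 3 go to hardware
```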

    by Madhusudhanan Baskaran, Chitra S, Anbuchelian S 
    Abstract: Recently, much attention has been paid to the domain of sentiment analysis (SA), with experts acknowledging both the scientific challenges and the possible applications of processing subjective language. SA is the computational analysis of opinions or sentiments conveyed in a body of text; its aim is to detect subjective data present in various sources and figure out the author's attitude towards the topic. In the current study, feature extraction is carried out using term frequency-inverse document frequency (TF-IDF) and feature selection through CMIM. Feature classification is done with the LogitBoost, CHAID and k-Nearest Neighbor classifiers, and the experimental results are contrasted with one another.
    Keywords: Sentiment Analysis; LogitBoost; CHAID; CMIM; k-Nearest Neighbor (kNN); Term frequency / Inverse document frequency; Stemming; Stop words.
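    As a concrete illustration of the pipeline's first and last stages, the sketch below runs TF-IDF feature extraction followed by a k-nearest-neighbor classifier on an invented toy corpus; the CMIM feature selection and the LogitBoost/CHAID classifiers from the study are not reproduced here.

```python
# TF-IDF feature extraction + kNN sentiment classification on a toy corpus.
# Texts and labels are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

train_texts = [
    "great product, works perfectly",
    "absolutely love this, highly recommend",
    "terrible quality, broke after a day",
    "waste of money, very disappointed",
]
train_labels = ["pos", "pos", "neg", "neg"]

# Each document becomes a vector of idf-weighted term frequencies.
vectorizer = TfidfVectorizer(stop_words="english")
X_train = vectorizer.fit_transform(train_texts)

# kNN labels a new document by its nearest training document.
clf = KNeighborsClassifier(n_neighbors=1)
clf.fit(X_train, train_labels)

print(clf.predict(vectorizer.transform(["love this great product"]))[0])
```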

    by Aswin Seshadri K, Thulasi Bai V 
    Abstract: In the recent past, many laboratories have explored the prospects of communication through cerebral activity for patients with neuromuscular disorders. A brain-computer interface (BCI) enables control of devices, or communication, through brain activity alone, without using muscles. It has been used successfully in scientific and therapeutic applications and helps improve the patient's quality of life. Electroencephalography (EEG) recorded from a person's scalp is used to control the BCI, and EEG signal analysis and classification is one of the prominent research areas in the field. The major challenges of BCI are the low signal-to-noise ratio of neural signals and the need for robustness in extracting a feature set from the brain signals and classifying it. In this work, we review data fusion techniques for EEG-based BCI along with Bayesian methods for BCI. The paper compares three feature extraction techniques: Laplacian, Kalman and fused Laplacian-Kalman. The features obtained were classified using a Naive Bayes classifier. Source identification and spatial noise reduction are achieved through the surface Laplacian, whose two functions are associated with prediction accuracy and signal orthogonality in BCI.
    Keywords: Brain–Computer Interface (BCI); Feature Extraction; Laplacian; Kalman Filter and Naïve Bayes Classifier.

    by Thiyagarajan Venkatesh 
    Abstract: Global Positioning System (GPS) satellites produce low-power signals that travel great distances to reach the receiver. To negate a GPS system, an adversary needs only to generate a jamming signal with enough power and a suitable temporal or spectral signature to deny the use of GPS throughout a given area. The first system developed to increase GPS anti-jam capability for users on the ground or in the air was the controlled reception pattern antenna. This device consists of an array of antenna elements, all connected to an electronics box that controls the phase, the gain or both, and combines them to give a single output. From both military and civilian perspectives it is important to establish an adequate anti-jamming capability for GPS systems and ensure the availability of this asset in all environments. The military recognized this, which resulted in the development of several mitigation techniques in the time domain, the time-frequency domain, Adaptive Antenna Arrays (AAA) and PC-based software-defined radio concepts. In this study, a circular geometry of five patch antennas operating at L2 = 1.227 GHz is designed and fabricated. A phase-only nulling technique based on hybrid optimization is proposed and evaluated using the IE3D software.
    Keywords: Global Positioning System (GPS); anti-jam; Adaptive Antenna Arrays (AAA); Circular geometry; patch antennas; Phase only nulling; Artificial Bee Colony (ABC) algorithm; Cuckoo Search (CS).

  • Power Audit: An estimation model-based tool as a support for monitoring power consumption in a distributed network infrastructure   Order a copy of this article
    by Aziz Dahbi, Asmaa El Hannani, Abdelhak Aqqal, Abdelfatteh Haidine 
    Abstract: Understanding the details of power consumption in a distributed IT infrastructure has become essential for making efficient power management decisions. Indeed, energy costs are an increasingly major factor in the total cost of ownership (TCO) of IT equipment in both data centers and enterprise computing. However, measuring and monitoring the power consumption of systems in medium- to large-scale distributed infrastructures is often difficult because of the large, dispersed deployment of heterogeneous equipment such as personal computers (PCs), routers, switches and printers. This study is organized around: i) an approach for measuring the power consumption of devices in a distributed infrastructure, starting with computers; and ii) collecting the measurements on a monitoring server over the network, for supervisory purposes, using the Simple Network Management Protocol (SNMP). We have designed and developed software named "Power Audit" to support both aspects.
    Keywords: IT equipment; SNMP protocol; distributed infrastructure; power management; power consumption.

Special Issue on: Advances in Information Security, Privacy and Forensics of Multimedia Big Data in the Internet of Things

  • Fingerprinting Violating Machines with In-Memory Protocol Artifacts
    by Mohammed Al-Saleh, Yaser Jararweh 
    Abstract: Cyber crime has increased as a side effect of the dramatic growth in Internet deployment. Identifying the machines responsible for crimes is a vital step in an attack investigation, and tracking the attacker's IP address to its origin is indispensable. However, apart from finding the attacker's (possible) machine, it is essential to provide supportive proofs that bind the attack to the attacker's machine, rather than depending solely on the attacker's IP address, which can be dynamic. This paper proposes to implant such supportive proofs by utilizing the internals of three well-known Internet protocols: IP, TCP and ICMP. Our results show that there can be potential proofs in the structures of these protocols. In addition, because a violator is unaware of (and has no control over) the involved protocols, the investigation process is empowered with stealth. To the best of our knowledge, we are the first to utilize protocol remnants in fingerprinting violating machines.
    Keywords: fingerprinting; violating machine; protocol artifacts.

  • Enhancement of 3-D Playfair Algorithm using dual key
    by Arnab Kumar Das, Nabanita Das 
    Abstract: The Playfair cipher is one of the well-known polygraphic substitution ciphers. In this paper we present a new approach for the secure transmission of a message using a modified version of the Playfair cipher combined with an exclusive-OR (XOR) operation and a dual key. The technique uses three functions: one generates the matrix, and the other two perform encryption and decryption. The proposed extended 3D Playfair cipher works with 256 (4x8x8) characters: 52 alphabetic characters (upper and lower case), 10 numerals and the 194 most commonly used special characters of the ASCII character set. We use the 3D version of the Playfair cipher while retaining the digraph concept. The restrictions of the existing 2D Playfair ciphers, and of the 3D Playfair ciphers using 4x4x4 and 6x4x4 matrices, are overcome in the proposed work, which can accommodate more characters than the existing 3D Playfair ciphers.
    Keywords: Playfair; cipher; polygraphic; encryption; decryption; ASCII.
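    To make the geometry concrete, the sketch below builds a keyed 4x8x8 cube over 256 distinct characters (52 letters, 10 digits and the remaining 194 single-byte codes) and looks up a character's (layer, row, column) coordinates, the first step any 3D Playfair variant performs before applying its digraph substitution rules. This is an illustrative construction, not the authors' exact algorithm.

```python
# Build a keyed 4x8x8 character cube and locate characters in it, as a 3D
# Playfair variant would before applying its substitution rules.
from string import ascii_letters, digits

def build_cube(key):
    # 52 letters + 10 digits + the remaining 194 single-byte codes = 256.
    charset = ascii_letters + digits + "".join(
        chr(c) for c in range(256) if chr(c) not in ascii_letters + digits
    )
    # Key characters first (deduplicated), then the rest, as in classic Playfair.
    seen, ordered = set(), []
    for ch in key + charset:
        if ch not in seen:
            seen.add(ch)
            ordered.append(ch)
    assert len(ordered) == 256
    # Reshape the flat ordering into 4 layers of 8x8.
    return [[ordered[l*64 + r*8:l*64 + r*8 + 8] for r in range(8)]
            for l in range(4)]

def locate(cube, ch):
    for l, layer in enumerate(cube):
        for r, row in enumerate(layer):
            if ch in row:
                return (l, r, row.index(ch))
    raise ValueError(ch)

cube = build_cube("SECRET")
print(locate(cube, "S"))  # key character 'S' occupies the first cell: (0, 0, 0)
```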

  • A Knowledgebase Insider Threat Mitigation Model in the Cloud: A Proactive Approach
    by Qutaibah Althebyan, Yaser Jararweh, Qussai Yaseen, Rami Mohawesh 
    Abstract: The security of cloud computing is a major concern for both organizations and individuals. Organizations are looking for more trust from individuals, while cloud users want to make sure that their private data will be safe from disclosure, either by outsiders or by (possibly malicious) insiders of the cloud (cloud agents). Hence, insider threats in cloud computing are a major issue that needs to be tackled and resolved. In this paper, we propose a proactive insider threat model using a knowledgebase approach: proactive in the sense that the model tries to detect, in advance, any deliberate deviation from the legal accesses an insider might perform, so that individuals' private data are protected and the cloud resources are kept secure and consistent at all times. Knowledgebase models were used earlier to prevent insider threats at both the system level and the database level; this work extends them to cloud computing systems. The proposed model ensures in-advance mitigation, in the form of detection (and hence a chance of prevention) of possible insider breaches, by correlating the knowledge of system administrators who may grant undesired privileges to insiders of the underlying cloud data center. The model handles the insider threat at several levels of a cloud data center, the host level and the network level, where insiders are categorized into several levels of privilege according to their locations within the data center. Simulation results show that the proposed model works well in predicting malicious acts of insiders of the cloud data center and that, while effective in predicting insider threats, it performs with minimum overhead; indeed, the number of blocked insiders is reduced to the minimum.
    Keywords: Insider; Proactive; Cloud Data Center; Knowledgebase; Prediction; Mitigation.

  • Botnet Detection based on DNS Traffic Similarity   Order a copy of this article
    by Ahmad Manasrah, Walaa Bani Domi, Nur Nadiyah Suppiah 
    Abstract: Despite the efforts in combating the threat of botnets, they still grow in size and in evasion techniques. The bot software is written once and spreads to machines all over the world. It is preconfigured to locate the malicious domain name (if it is not static) through the DNS system, like any other legitimate host. In this paper, a scalable approach for detecting a group of bot hosts from their DNS traffic is proposed. The approach leverages a signal processing technique, power spectral density (PSD) analysis, to discover the significant frequencies (i.e. periods) of the botnet's periodic DNS queries. It processes the timing information of the generated DNS queries, regardless of the number of queries or domain names. Measuring the level of similarity between hosts exhibiting periodic DNS queries should reveal the group of bot hosts in the monitored network. Finally, we evaluated the approach using multiple DNS traces collected from different sources, along with a real-world botnet deployed in a controlled environment. The evaluation shows that the proposed approach was able to detect the group of bot hosts exhibiting a similar periodic DNS pattern with high accuracy and minimal false-positive rates.
    Keywords: Botnet detection; Traffic similarity; Traffic anomaly; Group Activity; Malware activity; Traffic behavior analysis; Network Intrusion Detection.
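    The core signal-processing idea can be sketched in a few lines: bin a host's DNS query timestamps into a time series, compute its power spectral density, and recover the beaconing period from the autocorrelation that the PSD encodes (Wiener-Khinchin theorem). The timestamps below are synthetic (a bot querying exactly every 60 s); real traces would additionally need detrending and tolerance to timing jitter, which the paper's approach addresses.

```python
# Recover the period of synthetic, perfectly regular DNS queries from the
# power spectral density of their binned time series.
import numpy as np

period, duration = 60, 3600                    # 60 s beacon over one hour
times = np.arange(0, duration, period)         # query timestamps (seconds)
counts = np.histogram(times, bins=np.arange(duration + 1))[0].astype(float)

# Periodogram, zero-padded to 2N so its inverse transform below gives the
# *linear* (non-circular) autocorrelation of the count series.
psd = np.abs(np.fft.rfft(counts, n=2 * duration)) ** 2

# Wiener-Khinchin: the inverse transform of the PSD is the autocorrelation.
acf = np.fft.irfft(psd)[:duration]
estimated = int(np.argmax(acf[1:]) + 1)        # strongest non-zero lag
print(estimated)  # -> 60
```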

Special Issue on: Intelligent Computation Systems

  • Study of Skin flow motion pattern using photoplethysmogram
    by Neelamshobha Nirala 
    Abstract: Microcirculatory dysfunction is related to many diseases and occurs long before their clinical manifestation. We used the wavelet transform to study the microcirculatory regulatory mechanism in three different groups (18 diabetic subjects, 8 with peripheral arterial disease (PAD) and 14 healthy controls) using the toe photoplethysmogram (PPG), from which 11 different features were derived. Compared with healthy subjects, we observed a significant decrease in neurogenic (VNe: 286.41 vs. 125.29 (a.u.), p-value = 0.000), myogenic (VMe: 281.55 vs. 29.02, p-value = 0.000) and respiratory activity (VRe: 37.68 vs. 9.35, p-value = 0.022) in the diabetic group, and a significant increase in cardiac activity (VCe: 19.69 vs. 33.89, p-value = 0.007) in the PAD group. Linear multiple regression analysis showed a significant negative association of age with myogenic activity (p-value = 0.002, r-value = 0.173) and of BMI with neurogenic activity (p-value = 0.036, r-value = 0.375). Our study showed that the PPG signal can be used as a non-invasive tool for studying vasomotion impairment in diabetic patients at rest.
    Keywords: Continuous Wavelet Transform; Laser Doppler flow meter; Photoplethysmogram; Microcirculation; Skin blood flow; Vasomotion.

  • Automated transformation of NL to OCL Constraints via SBVR   Order a copy of this article
    by Murali Mohanan 
    Abstract: This paper presents a novel method to automatically generate Object Constraint Language (OCL) constraints from natural language (NL) statements. In the Unified Modeling Language (UML) standards, OCL is used to check whether a model follows given process- or domain-specific heuristics, and to improve the precision of model specifications. As constraints are key components in the skeleton of business or software models, one has to write constraints to semantically complement business or UML models. To support software practitioners in using OCL, the method aims to produce a framework in which the user of a UML tool can write constraints and pre/post conditions in a natural language such as English, and the framework converts such natural language expressions into equivalent OCL statements. Two well-known technologies are used: Open Natural Language Processing (OpenNLP) and the Semantics of Business Vocabulary and Rules (SBVR). OpenNLP is used in a preprocessing phase to process the natural language statements, including sentence splitting, tokenization and part-of-speech (POS) tagging. In the second, transformation phase, SBVR is used to automatically transform the preprocessed natural language statements into SBVR specifications; SBVR plays a major role in this transformation because it uses the syntax of natural language. The main aim of the research is to provide automated tool support for model processing tasks in UML models by model-transforming the SBVR specifications into OCL specifications, as described in the Model Driven Architecture (MDA).
    Keywords: natural language processing; SBVR; UML; OCL.
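    As an illustration of the final transformation step, the hypothetical snippet below maps one SBVR-style structural rule to an OCL invariant string. The rule fields (context, attribute, operator, value) and the mapping table are invented for this sketch; the paper's OpenNLP/SBVR pipeline is far richer.

```python
# Hypothetical mapping from one SBVR-style structural rule to an OCL
# invariant string. Field names and the operator table are invented.
def sbvr_to_ocl(rule):
    ops = {"at least": ">=", "at most": "<=", "exactly": "="}
    return (f"context {rule['context']}\n"
            f"inv: self.{rule['attribute']} {ops[rule['operator']]} {rule['value']}")

rule = {"context": "Account", "attribute": "balance",
        "operator": "at least", "value": 0}
print(sbvr_to_ocl(rule))  # context Account / inv: self.balance >= 0
```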

Special Issue on: ICONI 2015 Internet Computing and its Applications

  • A software defined networking-based resilient framework combined with power-efficient PS-LTE network   Order a copy of this article
    by Muhammad Afaq, Wang-Cheol Song, M.G. Kang 
    Abstract: Computer networks have an increasingly important societal role, requiring them to be resilient to a range of challenges. For a network to be resilient, it should be accompanied by a state-of-the-art monitoring system that is not only tailored to the requirements of the network but also able to provide real-time network-wide visibility. Besides resilience, a network should also be power-efficient. For this purpose, we propose to combine an IP-based resilient SDN framework with a power-efficient PS-LTE network, so that data communication remains possible when a disaster occurs. In this paper, we focus on (1) the sFlow monitoring system required to make our SDN-based framework resilient against disasters, and (2) the power-efficient PS-LTE network. Our goal is also to trigger a more profound discussion on combining the two.
    Keywords: SDN; PS-LTE; resilience; power-efficient; sFlow monitoring.
    DOI: 10.1504/IJAIP.2018.10006402
  • Media-aware Scheduling Method for Transmitting Signaling Message over MPEG Media Transport-based broadcast   Order a copy of this article
    by Yejin Sohn, Minju Cho, Jongho Paik 
    Abstract: A broadcasting system must send signaling messages frequently, because users access the broadcast service at random times. However, the overhead caused by repeated transmission remains a big issue because of the limited bandwidth. To solve this problem, we propose a media-aware scheduling method for signaling messages in MPEG Media Transport (MMT)-based broadcasting. MMT recommends that the sending entity may send signaling messages at regular time intervals, but our method also considers the media type when sending messages. We compared and analyzed the two methods with various media encoding parameters that affect overhead size. As a result, the proposed method not only kept the latency similar to that of the MMT proposal but also reduced the overhead size.
    Keywords: broadcasting system; scheduling method; signaling message; MPEG Media Transport; Scalable High Efficiency Video Coding; media-aware; media encoding; random access.

  • Pedagogical Agility, and Agile Methodologies, in Computer System Development Education   Order a copy of this article
    by Roy Morien 
    Abstract: The agile development debate has been won, and there is substantial evidence to support this contention. Agile development can now be considered the major, first-preference software development method; there is much research to support its effective use and efficient practices, as well as its widespread adoption in organizations. The battle that must now be won is the acceptance of agile development methods as an integral part of the systems development curriculum in colleges and universities. Agile development of software systems can also be viewed as part of a much larger context, which we can call 'organizational agility'. The term is not unknown on the web or in the management literature; it basically means the ability of organizations to rapidly change or adapt in response to changes in the market. An associated concept is lean development, which has been known and understood for many years and whose concepts and practices have, of particular interest here, been adapted as lean software development, a subset of agile development.

This paper is therefore best seen as an education paper, based on a research approach now understood as 'the teacher-researcher in the classroom'. The author draws on 30 years of experience as a teaching academic to propose a radical approach to computer systems development pedagogy. It is now considered imperative to include agile development in university and college curricula. It is also proposed that the philosophy and practices inherent in organizational agility and lean product development be adopted to inform the educational and pedagogical processes, particularly in the teaching and learning of computer system development courses, however styled: information systems, information technology, business computing or computer science.
    Keywords: Agile Development; Agile Adoption; Organizational Agility; Lean Education; Agile Education; Student Self-Assessment; Project Based Learning; teacher-based research.

  • A Framework for Collaborative Information Management in Construction industry   Order a copy of this article
    by Qusay Al-Maatouk, Mohd Shahizan Othman 
    Abstract: The majority of architecture, engineering and construction (AEC) projects spend considerable time collecting and analyzing related information throughout the execution of each project activity. The flow of this information among project activities is usually more frequent than the workflow itself. Collaboration is therefore vital to project success and is considered one of the causal success factors in project management and development; teams with high levels of collaboration and coordination have been shown to be more effective. There is a global realization of how important it is to implement and integrate IT in the construction process in order to reduce cost and achieve more efficient projects. On the other hand, the ineffective use of IT in managing information exacerbates the amount of rework that occurs during many construction projects.
    Keywords: collaboration; cloud computing; Information management; Architecture Engineering and Construction.

  • An Automatic Detection of a Natural Marker and Augmentation of 3D Models in AR with Sketch-based Object Matching   Order a copy of this article
    by Junchul Chun, Jaejoon Seho 
    Abstract: This paper introduces a sketch-based localization approach to detect a desired natural marker from an input video image. The proposed method also retrieves a 3D virtual object to be augmented in Augmented Reality from a 3D database based on the object matching method. Sketchbased image matching has been used for content-based retrieval to compare the database images with a sketch-based image drawn by users and estimate the degree of similarity between the database images and the query image. In this paper, we adopt sketch-based object matching method to localize the natural marker of the video images to register a 3D virtual object in AR system. Most similar object in the input image is determined as a natural marker of the AR by comparing the user defined sketched image based on the basic features of the sketched object. Unlike other image matching methods, this matching technique is possible to produce query image without constraints by drawing the image intuitively. In addition, in the proposed sketched based AR system, the 3D object augmented on the marker will be also determined by object matching between the detected marker and 3D database images.
    Keywords: Augmented Reality; Sketch-based Image Matching; Object Matching; SURF; GrabCut Method; Local Binary Pattern; Natural Marker Detection.

  • Dynamic Spectrum Access for M2M-WANs: The African Regulators Spectrum Policy Reform Conundrum   Order a copy of this article
    by Luzango Mfupe, Fisseha Mekuria 
    Abstract: This paper presents work that addresses the network capacity demands of Internet of Things (IoT) and machine-to-machine (M2M) communications, based on the efficient management and utilization of radio spectrum resources. IoT/M2M applications are predicted to grow exponentially and cause a massive upsurge in network traffic. We argue that existing mobile network architectures are not optimized to handle billions of small intermittent transactions generated by M2M connections; therefore, a technology based on Dynamic Spectrum Access (DSA)-enabled low-power M2M wide area networks (WANs) is proposed. Subsequently, the article presents a use-case scenario demonstrating a possible deployment of a smart-metering M2M-WAN using TV white-space (TVWS) channels and a geo-location spectrum database technique. The simulated experimental M2M-WAN showed that, with only 4 TVWS channels, an entire metropolitan city can be covered to provide smart-metering services. Furthermore, the article suggests changes in existing spectrum management policies and technical regulations to accommodate the new DSA-based technologies, and a cost-effective techno-regulatory policy model is suggested to promote DSA-enabled low-power M2M-WANs.
    Keywords: IOT; M2M-WAN; TVWS; DSA; Smart-meter; Spectrum; Spectrum policy; Geo-location spectrum database; WSD; Spectrum regulator; Low-Power.

  • An Integrated framework for Posture Recognition   Order a copy of this article
    by Shipra Madan, Devpriya Soni, Harvinder  
    Abstract: Postures can be identified and classified from video sequences using scale-invariant keys as classification features, and the results can be used in fields such as surveillance, medical diagnosis and training. In this paper, frames are extracted from the given video files and transformed into a large collection of local feature vectors using the Scale Invariant Feature Transform (SIFT), each of which is unaffected by image translation, scaling and rotation, and to some extent invariant to illumination changes and affine or 3D projection. Features are grouped using k-means clustering, in which each posture belongs to the cluster with the nearest mean. A multiclass support vector machine, the directed acyclic graph SVM (DAG-SVM), then assigns labels to the centers obtained from clustering, and AdaBoost is incorporated to boost the classifier's accuracy. The dataset used in this study is a Bharatnatyam video dataset. The posture classification model is shown to outperform state-of-the-art classification systems on videos, achieving a classification accuracy of 89%.
    Keywords: Posture classification; Support Vector Machine (SVM); Scale Invariant Feature Transform (SIFT); k-means clustering.
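    The frame-classification pipeline above (SIFT descriptors, then a k-means visual vocabulary, then per-frame histograms, then an SVM) can be sketched schematically as follows. Random 128-dimensional vectors stand in for real SIFT descriptors, and a plain linear SVC replaces the paper's DAG-SVM with AdaBoost, so this shows only the shape of the pipeline, not its reported accuracy.

```python
# Schematic bag-of-visual-words posture classifier. Random 128-d vectors
# stand in for real SIFT descriptors; SVC replaces the paper's DAG-SVM.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(1)
K = 8  # visual vocabulary size

def fake_descriptors(posture):
    # Each "posture" gets its own descriptor distribution (SIFT stand-in).
    return rng.normal(loc=posture, scale=0.3, size=(40, 128))

frames = [(fake_descriptors(p), p) for p in (0, 1, 2) for _ in range(10)]

# Vocabulary: k-means over all descriptors from all frames.
vocab = KMeans(n_clusters=K, n_init=5, random_state=0)
vocab.fit(np.vstack([d for d, _ in frames]))

def bow_histogram(desc):
    # Frame feature = normalized histogram of nearest visual words.
    words = vocab.predict(desc)
    return np.bincount(words, minlength=K) / len(words)

X = np.array([bow_histogram(d) for d, _ in frames])
y = np.array([label for _, label in frames])

clf = SVC(kernel="linear").fit(X, y)
print(clf.score(X, y))
```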

  • Low-Illuminated SPOT-5 Image Improvement for Density-based Vegetation Identification using 3-Layer Color Manipulation Approach   Order a copy of this article
    by Nursyafikah Hamid, Hishammuddin Asmuni, Rohayanti Hassan, Razib M. Othman 
    Abstract: The poor illumination quality of a satellite image is one of the challenges encountered in vegetation analysis, especially with regard to pan-sharpened medium-spatial-resolution SPOT-5 imagery, as it affects the accuracy of vegetation identification. In this paper, a 3-layer color manipulation approach is proposed to overcome low illumination in SPOT-5 images and thereby improve the performance of precise vegetation identification. The SPOT-5 image is pre-processed, and three layers of image enhancement techniques are used, specifically, to identify vegetation, reduce shadow appearance and enhance contrast for color uniformity, in order to improve the low illumination quality of the images. These steps are followed by a supervised classification process for density-based discrimination of vegetation areas. The approach was tested using multispectral medium-spatial-resolution SPOT-5 imagery covering the Ramsar Convention site of Tanjung Piai, located at the southernmost tip of mainland Asia, over the years 2008, 2011 and 2013. The results, supported by accuracy assessments and ground-truth validation, show that the proposed approach performs better than existing techniques when dealing with low-illuminated medium-resolution multispectral imagery for density-based vegetation identification.
    Keywords: low illumination; illumination enhancement; medium spatial resolution; vegetation identification; multispectral image.

  • A Study on the Security Impact of the Web Services Implementation in the Malaysian Governments Online Applications   Order a copy of this article
    by Weilin Chan, Mohammad Faidzul Nasrudin, Ibrahim Mohamed 
    Abstract: Over the years since their introduction, most organizations have believed that web services could be the best solution to security issues in online application services, one of the foremost being the resolution of existing threats and vulnerabilities. However, without proper configuration, web services may introduce new problems to the application environment without the system developer being aware of them. The purpose of this paper is to determine the relevant security factors, and the degree of security each factor provides, when implementing web services in the Malaysian government's online applications. The result of this study is a model of security-level determinant factors, with each factor color-coded according to its impact on security. Four core groups of factors were discovered, namely policies, expected vulnerabilities, security standards and quality of service, along with 13 environmental factors found to influence the core factors in web services implementation. The classification of these factors was based on the nature of business and codes of conduct in public sector agencies. Factor groups assessed with an impact value between 2 and 3 require close attention and prompt action by the organization, as their impact level is high and may affect the severity of issues in the web services implementation. The model will assist administrators and decision makers in determining which of the security factors require protection against possible threats to the organization.
    Keywords: e-Government; web services security; online security policy; online applications; application vulnerabilities; security standards; quality of service.

  • Validated Agile Cost Management Success Factors in Software Development Projects.   Order a copy of this article
    by Zulkefli Mansor, Saadiah Yahya, Noor Habibah Arshad 
    Abstract: Managing agile project costs effectively and efficiently is important in ensuring that a project is successful. The objective of this paper is therefore to validate the factors that contribute to the success of agile cost management. The paper outlines eight key success factors: customer engagement, changes in requirements, communication, corporate culture, time allocation, simplicity, a cost-effective management process and the selection of a computerized tool. The study employed a mixed method, using questionnaires and interviews to collect data, and the Rasch measurement model was used to analyze the results. The results showed that all eight factors contribute to the success of agile cost management. These findings can assist practitioners and academics in avoiding problems in managing the costs of agile software development projects.
    Keywords: Effective; Practitioners; Requirements; Culture; Time; Simplicity; Process; Tool.

  • Computer Forensic Problem of Sample Size in File Type Analysis   Order a copy of this article
    by Hassan Chizari, Shukor Abd Razak, Mojib Majidi, Shaharuddin Bin Salleh 
    Abstract: File Type Identification (FTI) is the problem of determining a file's type from its content. As a computer forensic challenge, FTI has been studied extensively, with many solutions provided by researchers. One of the most popular methodologies is mathematical analysis, which examines the distribution of bytes to infer the file type (Byte Frequency Distribution (BFD) equations). The main question left open is how one can generalize a proposed FTI algorithm to all files. In this work, a normality assessment test was first applied to various BFD equations, showing that none of the BFD histograms follows a normal distribution. Then, using the Renkonen correlation to compare non-normal distributions, sample sizes that are representative of the population were derived for each file type and BFD equation. Finally, it is shown that, using the bootstrap method, the BFD distributions can be converted into normal distributions.
    Keywords: File Type Identification; Sample Size; Non-normal Distribution; Byte Frequency Distribution.
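    The byte-frequency comparison at the core of this approach can be illustrated with a short sketch (an illustrative reconstruction, not the authors' code; the simple normalized histogram and the function names are assumptions):

```python
from collections import Counter

def byte_frequency_distribution(data: bytes) -> list[float]:
    """Normalized histogram over the 256 possible byte values (a simple BFD)."""
    counts = Counter(data)
    total = len(data) or 1
    return [counts.get(b, 0) / total for b in range(256)]

def renkonen_similarity(p: list[float], q: list[float]) -> float:
    """Renkonen (percentage similarity) index: sum of element-wise minima.
    1.0 means identical distributions, 0.0 means disjoint support."""
    return sum(min(a, b) for a, b in zip(p, q))
```

    Comparing the BFD of a sample against a per-file-type reference BFD with such an index is meaningful for non-normal distributions; the paper's sample-size question is how large the sample must be for the index to be stable.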

  • An Improved Data Pre-Processing Method for Classification and Insider Information Leakage Detection   Order a copy of this article
    by Sung-Sam Hong, Dong-Wook Kim, Myung-Mook Han 
    Abstract: Data pre-processing, a step performed prior to data processing, converts data into a form that is easy to analyze. In this study, we propose a method for the pre-processing and integration of data collected from various sources to detect insider information leakage; further, we evaluate the performance of the data pre-processing by performing classification and detection experiments with collected normal and abnormal log data. An insider information leakage attack scenario was created, and the attack data for this scenario were generated in order to collect the corresponding log data. Log data in a normal environment were also collected. During normalization, the logs, which are unstructured data, were extracted and normalized into a mathematical model, and dimension reduction was performed on the high-dimension feature matrix. This pre-processing stage improved the efficiency of information leakage analysis and detection, as demonstrated by the results of our experiments. From the experimental results, we observe that securing the attack scenario and actual attack data is a very important factor in insider information leakage detection, owing to the small amount of attack data available. Classification results can improve depending on the number of classification categories and the amount of data; therefore, it is important to secure existing data and to build a knowledge base. In addition, the experiments showed that the Naive Bayes (NB) and support vector machine (SVM) classifiers have superior performance, with accuracies of 0.9991 and 0.9997, respectively, in source classification.
    Keywords: Data Pre-Processing; Data Leakage Detection; Classification; Log Analysis; Information Security; Intelligent Security Data Analysis; Feature Extraction.

  • A method of improving PRR for WiFi Interference Avoidance in ZigBee Networks in Indoor Environments   Order a copy of this article
    by Youn-Sik Hong, Sung-Jae Kho, Uk-Jin Jang, Jae-Ho Lee 
    Abstract: This paper focuses on how to avoid RF interference when WiFi and IEEE 802.15.4/ZigBee radios are deployed simultaneously or in close proximity in indoor environments. The circumstances are particularly unfavorable for ZigBee networks, which share the 2.4 GHz ISM band with WiFi senders capable of 10 to 100 times higher transmission power. However, ZigBee devices by nature transmit small amounts of data infrequently. Thus, we propose a solution that minimizes interference from WiFi while limiting ZigBee's channel occupancy rate. Another important point considered in this paper is that the packet reception ratio (PRR) varies with the shape of crossing corridors; in general, there are typical shapes of L, T, and +, depending on how the corridors cross. Thus, a mobile ad hoc network topology must be configured to transmit wireless packets via intermediate nodes. The interference-avoidance method proposed in this paper is channel hopping, triggered by evaluating two values at the receiver node: the latest received signal strength (RSS) values and the received acknowledgement (ACK) packets. The minimum RSS value is set to -50 dBm to guarantee reliable transmission. Our experiments show that a receiver node with a PRR of less than 65% cannot receive two or more consecutive ACK packets. The other method adopted in this paper to increase PRR, depending on the type of crossing corridors, is to deploy intermediate nodes at the shortest distance to their neighbours; this method yields an efficient multi-hop ad hoc wireless network topology.
    Keywords: Ad-hoc wireless networks; indoor environment; coexistence; interference; channel hopping.
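    The hopping trigger described in the abstract can be sketched as a simple decision rule (an illustrative reconstruction, not the authors' implementation; the -50 dBm threshold and the two-missed-ACK condition follow the figures quoted above):

```python
RSS_THRESHOLD_DBM = -50   # minimum RSS quoted for a reliable transmission
MAX_MISSED_ACKS = 2       # two or more consecutive missed ACKs signal a bad link

def should_hop(latest_rss_dbm: float, consecutive_missed_acks: int) -> bool:
    """Decide whether the receiver should hop to another ZigBee channel.

    Hops when the link looks unreliable: either the latest received signal
    strength has fallen below the threshold, or ACKs have stopped arriving.
    """
    return (latest_rss_dbm < RSS_THRESHOLD_DBM
            or consecutive_missed_acks >= MAX_MISSED_ACKS)
```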

Special Issue on: Advanced Intelligence and Computing Technology

  • Diminution of Power in Load/Store Queue for CAM and SRAM based Out-of-Order Processor   Order a copy of this article
    by Dhanalakshmi Gopal 
    Abstract: In the modern world, out-of-order superscalar processors are designed to achieve higher performance for non-numeric applications. Unfortunately, the improvement in performance has led to an increase in chip power and energy dissipation. The load/store queue is one of the major power-consuming units in the datapath during dynamic scheduling; it is designed to absorb bursts in cache accesses and to maintain the order of memory operations by keeping all in-flight memory instructions in program order. The proposed technique aims at reducing both dynamic and static power dissipation in the load/store queue (LQ/SQ) by using a power-gating technique and a priority encoder. Through this implementation, the least redesign and verification effort, the lowest possible design risk and the least hardware overhead are achieved without significant impact on performance.
    Keywords: Load/Store Queue; static power; dynamic power; CAM; SRAM.

  • Design of an ultra-low power, low complexity and Low Jitter PLL with digitally controlled oscillator   Order a copy of this article
    by N.K. Anushkannan, H. Mangalam 
    Abstract: This paper proposes a new area-efficient, low-power and low-jitter phase-locked loop (PLL) architecture working off a low-frequency reference. The new PLL uses a new low-complexity locking procedure, which results in an ultra-low-power design. The main challenge in designing the proposed PLL is to keep the area small while meeting the required low jitter. The proposed method was designed using only two up-down counters for finding the reference frequency. An efficient glitch-removal filter and a new low-power DCO are also introduced in this paper. The proposed DCO achieves a reasonably high resolution of 1 ps. The PLL architecture was demonstrated for frequency ranges from 100 to 400 MHz. The power consumption of the proposed PLL at 500 MHz is 820
    Keywords: phase-locked loop; digitally controlled oscillator; low power; low complexity; low jitter; glitch removal.
    DOI: 10.1504/IJAIP.2018.10006418
  • Effective content based pattern predicted text mining using PSE model   Order a copy of this article
    by Vijaya Kumar 
    Abstract: The Pattern Searching Engine (PSE) model provides a solution for applications that involve pattern-based mining, finding connections between patterns (e.g. emotions) and affective terms by categorizing the text in the content under examination. It discovers patterns of word use and connects documents that share similar patterns. The PSE model uses both theme-based analysis and concept-based analysis, which can predict the expected pattern using a semantics-based natural search model that links words with similar meanings and distinguishes uses of words with different meanings in an effective and fast manner.
    Keywords: Text mining; pattern based; pattern prediction; concept based.

Special Issue on: Advanced Intelligence Paradigms in Machine Vision, Image Processing and Pattern Analysis

  • Priority Based Trimmed Median Filter for Removal of High Density Salt and Pepper Noise   Order a copy of this article
    by Sudhakar R, Sudha V.K. 
    Abstract: This paper proposes an efficient and less complex priority-based trimmed median filter algorithm for restoring images corrupted by high-density salt-and-pepper noise. In this algorithm, a noisy pixel is replaced by the trimmed median value of the four horizontally and vertically adjacent pixels. If all four of these are 0s and 255s, then the next-priority four diagonally adjacent pixels are used to calculate the trimmed median for replacing the noisy pixel. If these four are also found to be 0s and 255s, then the noisy pixel is left unchanged until the next iteration. Experimental results on different grayscale and color images show that the proposed algorithm outperforms the standard median filter, the adaptive median filter, the decision-based algorithm, the modified progressive switching median filter and the modified decision-based unsymmetric trimmed median filter.
    Keywords: Salt and Pepper noise; Median filter; Adaptive Median Filter; Unsymmetric Trimmed Median Filter.
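    The per-pixel priority rule described above can be sketched in a few lines (a minimal pure-Python illustration of the stated rule, assuming 0/255 mark salt-and-pepper noise; function names are ours, not the authors'):

```python
def trimmed_median(values):
    """Median of the values that are not salt (255) or pepper (0); None if all noisy."""
    clean = sorted(v for v in values if v not in (0, 255))
    if not clean:
        return None
    mid = len(clean) // 2
    return clean[mid] if len(clean) % 2 else (clean[mid - 1] + clean[mid]) // 2

def restore_pixel(img, r, c):
    """Apply the priority rule to one pixel of a 2-D list `img`."""
    if img[r][c] not in (0, 255):           # pixel is not salt-and-pepper noise
        return img[r][c]
    h, w = len(img), len(img[0])
    cross = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]              # first priority
    diag = [(r - 1, c - 1), (r - 1, c + 1), (r + 1, c - 1), (r + 1, c + 1)]  # second priority
    for neighbours in (cross, diag):
        vals = [img[i][j] for i, j in neighbours if 0 <= i < h and 0 <= j < w]
        m = trimmed_median(vals)
        if m is not None:
            return m
    return img[r][c]   # all eight neighbours noisy: leave for the next iteration
```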

  • An Efficient approach for handling degradation in Character Recognition   Order a copy of this article
    by Sandhya N 
    Abstract: Recognition of historical printed degraded Kannada characters is not completely solved and still remains a challenge to researchers. In this paper, a scale for measuring the degradation of a character is proposed. The degradation is further characterized as high, medium or low based on this scale, which is then used to study the efficiency of the character restoration technique designed. A new recognition approach, Fit Discriminant Analysis (FDA), is proposed, and its recognition accuracy is compared with the existing techniques support vector machines (SVM) and Fisher linear discriminant analysis (FLD). Through extensive experimentation it is established that rebuilding characters significantly improves the recognition accuracy of the learning-based approaches SVM, FDA and FLD. It is further established that the proposed FDA approach gives the best recognition accuracy for historical printed degraded documents, and that constructing the training and testing sets using the proposed degradation measure is required for better recognition accuracy.
    Keywords: Degraded characters; Support Vector Machines; Fisher Linear Discriminant Analysis; Broken characters.

  • Pattern Analysis and Texture classification using Finite State Automata scheme   Order a copy of this article
    by B. Eswara Reddy, Ramireddy Obulakonda Reddy 
    Abstract: The paper proposes a complete model of finite state automata, along with an associated classifier, for texture classification. Pattern analysis of the texture image is performed by a proposed symbolic-pattern-based algorithm, developed from symbolic dynamics and finite state automata theory for estimating the state transitions of the texture variations. The texture image is divided into several partitions, i.e. the texture, the background of the texture, the shadow of the texture, etc. Finite automata state transitions are used to extract features from the symbolized image, and a binary classifier is designed to classify the texture categories based on these features. Pattern analysis is performed on the KTH-TIPS dataset for 10 varied categories of texture. A classification accuracy of 99.12% is achieved, and the experimental study shows the better efficiency of the proposed system compared to other existing state-of-the-art methods.
    Keywords: Finite automata; symbolic pattern; texture; classification.

  • A Novel Method for Super Resolution Image Reconstruction   Order a copy of this article
    by Joseph Abraham Sundar K, Vaithiyanathan V 
    Abstract: This paper describes a new method for super-resolution based on surveying adjustment. In this method, an observation model is developed for the sequence of low-resolution images, and from it an observation equation is derived for super-resolution image reconstruction (SRIR). The observation equations are used by the surveying adjustment to find the gray function. The proposed method is validated using both simulated and real experiments, and the results are compared with various recent techniques using performance measures such as peak signal-to-noise ratio and sharpness index. In both sets of experiments, the proposed surveying-adjustment-based super-resolution image reconstruction proved highly efficient, as needed for satellite imaging, medical imaging diagnosis, military surveillance, remote sensing, etc.
    Keywords: Super-Resolution; Image Reconstruction; Gray function; Observation model.

  • GLCM Based Detection and Classification of Microaneurysm in Diabetic Retinopathy Fundus Images   Order a copy of this article
    by Dhiravida Chelvi, Raja Mani, C.T.Manimegalai Murugeasn 
    Abstract: Pre-screening of the eye is very important in diabetic retinopathy to help ophthalmologists provide relevant treatment. Diabetic retinopathy is a major cause of blindness and includes lesions such as microaneurysms, haemorrhages and exudates. Microaneurysms, small red dots on retinopathy fundus images, are the first clinical sign of diabetic retinopathy and an early detectable sign of impending vision loss; their number is used to indicate the severity of the disease. The first step in preventing the disease is the automatic detection of microaneurysms at an early stage, which reduces the manual workload and cost. A novel method of microaneurysm detection for retinopathy images is proposed here. The proposed algorithm detects and classifies microaneurysms from diabetic retinopathy fundus images, even in low-resolution images. Initially, the image is processed by a median filter and enhanced by contrast-limited adaptive histogram equalization (CLAHE). Microaneurysm candidates are then extracted by the extended-minima method. Principal component analysis (PCA) is used as a pre-feature extractor in terms of the size, shape and colour of the microaneurysms. To improve the efficacy of the system, statistical features are finally extracted via the gray-level co-occurrence matrix (GLCM) and given to a k-NN classifier to classify microaneurysms accurately. The detected microaneurysms are validated by comparison with expert ophthalmologists' hand-drawn ground-truth images. The simulation results show a sensitivity of 95.7%, a specificity of 90.56% and an accuracy of 93% for the proposed algorithm.
    Keywords: Micro aneurysm; Diabetic Retinopathy; Image Processing; Pre-Processing; Image Classification.
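    The GLCM feature-extraction step mentioned above can be illustrated with a minimal pure-Python sketch (the function names and the particular Haralick-style statistics chosen here are illustrative, not taken from the paper):

```python
def glcm(image, dr, dc, levels):
    """Gray-level co-occurrence matrix of a 2-D list `image` for one
    displacement (dr, dc), normalized so all entries sum to 1."""
    h, w = len(image), len(image[0])
    m = [[0.0] * levels for _ in range(levels)]
    pairs = 0
    for r in range(h):
        for c in range(w):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < h and 0 <= c2 < w:
                m[image[r][c]][image[r2][c2]] += 1
                pairs += 1
    return [[v / pairs for v in row] for row in m]

def glcm_features(m):
    """A few classic statistics computed from a normalized GLCM."""
    n = len(m)
    contrast = sum(m[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))
    energy = sum(v * v for row in m for v in row)
    homogeneity = sum(m[i][j] / (1 + abs(i - j)) for i in range(n) for j in range(n))
    return {"contrast": contrast, "energy": energy, "homogeneity": homogeneity}
```

    In practice several displacements (angles and distances) are accumulated and their statistics concatenated into the feature vector handed to the classifier.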

  • Face Recognition using combined Binary particle swarm optimization and Hidden layer of Artificial Neural Network   Order a copy of this article
    by Charan S G 
    Abstract: Face recognition is a challenging domain in which artificial neural networks have been seen to perform very well for both detection and recognition. In this paper, we propose a novel feature-extraction method in which the outputs of the hidden layer of a neural network are utilized as the first level of features. To these features we apply binary particle swarm optimization (BPSO) to remove redundancy, i.e. the few redundant hidden units in the network. BPSO over hidden-layer outputs can be implemented in two ways: 1) applying BPSO over the hidden layer during the training stage, so the network is better optimized; 2) directly using BPSO on an optimized neural network's hidden-layer output. Both techniques performed well compared with a traditional neural network and conventional BPSO. Experiments on the FERET and LFW datasets show promising results.
    Keywords: Face Recognition; Hidden Data Mining; Particle Swarm Optimization; Artificial Neural Network; Hybrid Intelligent model.
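    The BPSO selection of hidden units can be sketched as follows (a minimal, generic binary PSO over a 0/1 mask; the fitness function, parameter values and names are assumptions for illustration, not the authors' configuration):

```python
import math
import random

def bpso_select(fitness, n_bits, n_particles=10, iters=30, seed=0):
    """Minimal binary PSO: evolves 0/1 masks (e.g. over hidden-unit outputs)
    to maximise fitness(mask). The sigmoid of each velocity component gives
    the probability of the corresponding bit being 1."""
    rng = random.Random(seed)
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    pos = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    vel = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] += (2 * r1 * (pbest[i][d] - pos[i][d])
                              + 2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = 1 if rng.random() < sig(vel[i][d]) else 0
            f = fitness(pos[i])
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest, gbest_fit
```

    In the paper's setting, `fitness` would score a mask by the recognition accuracy obtained when only the selected hidden-unit outputs are kept.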

  • Iris recognition system based on a new combined feature extraction method.   Order a copy of this article
    by Izem Hamouchene, Saliha Aouat 
    Abstract: Recent scientific studies have focused on automatic systems that operate without human intervention, a concept crucially needed in several research fields and in industry. Indeed, the security field is in great need of automatic identification systems based on biometrics. The human iris is considered the best biometric mark for identification due to the stability, distinctiveness and uniqueness of its features over time; the uniqueness of the texture present in the human iris is a natural password, a property coveted by the security field. In this paper, we propose a novel automated iris recognition approach based on a combination of two systems. The first system is based on the regional variation (RV) method, which decomposes the iris image into several blocks and then encodes the variation of the mean and the variance to generate the regional descriptors. The second system is based on a new feature extraction method called rotation-invariant neighborhood-based binary pattern (RINBP) (Hamouchene and Aouat, 2014), which extracts the relative local information between neighboring pixels and is also robust against rotation. Two sets of support vector machine (SVM) based learning algorithms are used to train the two systems, and the output scores of the two systems are normalized. Dempster-Shafer theory is used to distribute unitary mass over the two output sets of SVMs. Finally, the combined belief measures are transformed into a probability by applying Dezert-Smarandache theory. In the experiments, the CASIA iris image database is used as a benchmark, and the proposed system is compared to well-known iris recognition systems (Wildes, 1997; Masek, 2003; Han et al., 2014; Himanshu et al., 2014; Hamouchene and Aouat, 2014). The experiments illustrate that the proposed recognition system obtains better recognition rates, demonstrating the efficiency of the feature extraction methods (RV and RINBP) and the decision model, which give promising results.
    Keywords: Iris Recognition System; Neighborhood-based Binary; Texture analysis; Mean and variance variations; Dempster-Shafer theory; Support Vector Machines.

  • Enhanced method of using Contourlet transform for medical image compression   Order a copy of this article
    by Eben Sophia P, Anitha J 
    Abstract: With the aim of improving compression performance using the contourlet transform, singular value decomposition (SVD) of the intermediate subbands has been investigated. In this way, the size of the contourlet transform subbands can be efficiently reduced to induce compression. This novel lossy compression technique enhances the compression performance of the contourlet transform and produces good-quality images even at lower bit rates. In addition to SVD, normalization and prediction of the decomposed subband coefficients also improve the compression performance. The method was tested on medical MRI (magnetic resonance imaging) and CT (computed tomography) imaging modalities. The statistical results confirm the efficiency of the proposed method in terms of CR (compression ratio), PSNR (peak signal-to-noise ratio) and BPP (bits per pixel). The method achieves good compression with approximately 47 dB PSNR at bit rates as low as 0.1 BPP, which makes it well suited to medical image communication and storage applications such as PACS (picture archiving and communication systems) and RIS (radiology information systems), and also helps in easy search and retrieval.
    Keywords: Contourlet transform; singular value decomposition; prediction; lossy compression; Arithmetic coding; medical MRI and CT images etc.

  • Video-based assistive aid for blind people using object recognition in dissimilar frames   Order a copy of this article
    by Hanen Jabnoun, Faouzi Benzarti, Frédéric Morain-Nicolier, Hamid Amiri 
    Abstract: Developing visual aids for handicapped persons is an active research area in the computer vision community. This paper presents a visual substitution tool for blind people based on object recognition in video scenes. It focuses on optimizing the video processing by calculating the dissimilarity between frames. The approach includes the real-valued local dissimilarity map method in the frame dissimilarity measures, and uses scale-invariant feature transform keypoint extraction and matching to identify objects in dissimilar frames. The experimental tests show some encouraging results in finding objects of interest. Thus, the proposed method can be a choice for helping blind and disabled persons in their interaction with the surrounding environment.
    Keywords: Pattern recognition; video processing; visual substitution system; Scale Invariant Features Transform; Real Valued Local Dissimilarity Map; keypoints matching.

  • Brachiopods classification based on fusion of Contour and Region based descriptors   Order a copy of this article
    by Youssef Ait Khouya, Faouzi Ghorbel 
    Abstract: In this paper, we propose a contour-region based shape descriptor for brachiopod classification using a combination of Fourier descriptors and the R-transform extracted from the Radon transform. Fourier descriptors are supported by the well-developed and well-understood Fourier theory and are powerful features for the recognition of two-dimensional connected shapes; we use the stable and complete Fourier descriptors proposed by Ghorbel to represent the contour information. To depict the shape's interior content we use the R-transform, whose advantages lie in its low computational complexity and geometric invariance. We compared the proposed descriptor with the curvature scale space, R-transform and Ghorbel descriptors using the city-block distance measure and our brachiopod database. The experimental results reveal the performance of the proposed descriptor, which is independent of the starting point and efficient.
    Keywords: Brachiopod;Fourier descriptors; Radon transform; R-transform; Curvature Scale Space.

  • Identification of Human Activity Pattern in Controlled Web Environment: An Adaptive Framework   Order a copy of this article
    by A. Chakraborty, D. Banerjee, R.T. Goswami 
    Abstract: This paper presents a new aspect of research on human web-based activity pattern analysis. The web activity pattern analyzer is part of the main goal of this research: human psycho-emotional behavioural pattern analysis. In the current era, people depend heavily on the internet for many aspects of their lives, which is why each individual user's internet usage pattern is becoming a very powerful resource for knowing that user. The needs of web users can be met more efficiently if their requirements are known to the providers. These usage patterns are found to be unique to each user, to some extent, according to the current psycho-emotional state of that individual user when he or she is in a controlled web environment, and this can serve as another mark of authentication of that particular user. This concept has already been applied in some real-world application domains, namely user authentication protocols, personalized e-learning, and linked-data analysis for the Resource Description Framework in the Semantic Web.
    Keywords: Session_Sequence; Activity Pattern; Adaptive Algorithm; Dempster–Shafer theory; Belief Function; Recommender Agent; RDF Graph.

Special Issue on: Advanced Pattern Recognition and Soft Computing Paradigms

  • Using a soft computing method for impedance modelling of li-ion battery current   Order a copy of this article
    by Mohammad (Behdad) Jamshidi, Rouzbeh Farhadi, Morteza Jamshidi, Zahra Shamsi, Seyedfoadin Naseh 
    Abstract: Soft computing is highly regarded as a powerful tool for modelling complex systems. The adaptive neuro-fuzzy inference system (ANFIS) is one of the best soft computing methods for identifying and modelling non-linear systems. In this paper, the complex impedance behaviour of li-ion batteries is studied using ANFIS, with the purpose of presenting an approach for the modelling and identification of electrochemical systems. The method can be refined to reach the most accurate model of the batteries. In the presented work, the complex current is modelled as the most important element of the batteries in the impedance state. The modelling results showed that this method can produce acceptable output for impedance modelling of the batteries.
    Keywords: Electrochemical; impedance modelling; li-ion battery; soft computing; complex systems; systems engineering; ANFIS.

  • Provide a new clustering scheme based on density to enhance energy efficiency in wireless sensor networks   Order a copy of this article
    by Mahdis Fathi, Mousa Nazari 
    Abstract: Studies and research related to wireless sensor networks (WSNs) are growing today due to their various uses in different fields. A wireless sensor network includes many small nodes located in an intended environment. Since the dimensions of these sensors are small, they work with non-rechargeable batteries as energy-limited instruments, so energy conservation is very important. Clustering the sensor nodes is an effective way to diminish the energy consumed by these networks. Accordingly, a novel clustering scheme based on the density-based clustering approach is presented in this article. In this new method, nodes that are located within proximity of each other are placed in one cluster and, unlike some algorithms, there is no need to determine the exact number of clusters. Simulation outcomes indicate that the lifetime and total packet delivery of the proposed method are increased relative to other related methods.
    Keywords: clustering; density-based; energy efficiency; wireless sensor networks; WSNs.

  • Energy-aware traffic engineering in IP networks using non-dominated sorting genetic II algorithm   Order a copy of this article
    by Raheleh Samadi, Mohammad Nassiri, Muharram Mansoorizadeh 
    Abstract: The wide spread of computer networks, along with increasing traffic demand throughout the Internet, has caused a dramatic increase in the energy consumed by networking devices and Internet infrastructure. Energy-aware traffic engineering is a promising approach towards green networking that achieves a trade-off between energy saving and network utilization in backbone networks. In this paper, we propose to use the non-dominated sorting genetic algorithm (NSGA-II) for energy-aware intra-domain traffic engineering. The algorithm makes a trade-off between maximum link utilization (MLU) and energy conservation: for each pair of network topology and traffic matrix, NSGA-II computes the optimal set of links to put to sleep such that the resulting topology is still able to carry the traffic demand. We developed a simulator to evaluate the performance of our mechanism. The results of comprehensive evaluations show that our energy-aware TE approach improves energy conservation by 50% at the cost of a slight increase in maximum link utilization.
    Keywords: Energy saving; Traffic engineering; Link utilization; Genetic algorithm; non-dominated sorting.
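    The core of NSGA-II, fast non-dominated sorting over the two objectives (here, maximum link utilization and energy), can be sketched as follows (an illustration of the generic algorithm with both objectives minimized, not the authors' simulator):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimized, e.g. (MLU, energy))."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(points):
    """Partition objective vectors into Pareto fronts, best front first."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts
```

    The first front is the Pareto-optimal set offered to the decision maker; lower fronts are used by NSGA-II to rank the rest of the population during selection.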

  • A Comparison of Data mining Methods for Diagnosis and Prognosis of Heart Disease.   Order a copy of this article
    by Mohammad Reza Afrash, Mehdi Khalili, Maral Sedigh Salekde 
    Abstract: Heart disease is a term that covers a range of disorders that affect the heart. Since medical decisions are still mostly based on the knowledge and experience of doctors, rather than on the knowledge hidden in numerous patient records, the process is exposed to human errors, which may lead to late discovery of disease or influence how services are offered to patients. Creating an automatic or semi-automatic detection system that combines both knowledge and experience in the field of health care is therefore very useful and necessary. This paper compares data mining algorithms for the diagnosis and prognosis of heart disease in an automatic, intelligent heart disease prediction system. Accordingly, we first use a dataset with 14 attributes. Secondly, we develop a prediction model using Na
    Keywords: data mining techniques; heart disease; classification; Weka.

Special Issue on: Nature-inspired Computing and Its Applications

  • Improving the Search Efficiency of Differential Evolution Algorithm by Population Diversity Analysis and Adaptation of Mutation Step Sizes   Order a copy of this article
    by Dhanya M. Dhanalakshmi, M.S. Akhila, C.R. Vidhya, Gurusamy Jeyakumar 
    Abstract: The aim of this research work is to improve the efficiency of the differential evolution (DE) algorithm in cases where its search is unsuccessful. Initially, this work discusses and compares different methods of measuring the population diversity of the DE algorithm, implemented for the DE/rand/1/bin variant on a set of benchmark functions. Based on this comparison, a method is identified that well demonstrates the difference in the evolution of population diversity between successful and unsuccessful DE searches. The work is then extended to detect unsuccessful searches in advance using the evolution of population diversity as measured by the identified method. On detecting a search as unsuccessful, a parameter adaptation strategy that adapts the mutation step size (F) is added to the DE algorithm to recover from it. The improved DE algorithm, which adapts the F value based on population diversity, is compared with its classical version and found to outperform it. The comparison results are reported in this paper.
    Keywords: Differential Evolution; Premature Convergence; Stagnation; Mutation Step Size; Parameter Adaptation; Population Diversity; Population Variance.
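    The DE/rand/1/bin variant with a diversity-triggered adaptation of F can be sketched as follows (a generic illustration: the variance-based diversity measure and the doubling rule are our assumptions, not the paper's exact adaptation strategy):

```python
import random
import statistics

def de_rand_1_bin(obj, bounds, pop_size=20, gens=100, f=0.5, cr=0.9,
                  diversity_floor=1e-6, seed=1):
    """DE/rand/1/bin minimiser with a simple diversity-based adaptation:
    when the mean per-dimension population variance collapses below a floor
    (a symptom of premature convergence or stagnation), the mutation step
    size F is enlarged to push the population apart again."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [obj(x) for x in pop]
    for _ in range(gens):
        diversity = sum(statistics.pvariance([x[d] for x in pop])
                        for d in range(dim)) / dim
        f_eff = min(0.9, f * 2) if diversity < diversity_floor else f
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)
            trial = [pop[a][d] + f_eff * (pop[b][d] - pop[c][d])
                     if (rng.random() < cr or d == jrand) else pop[i][d]
                     for d in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            tf = obj(trial)
            if tf <= fit[i]:
                pop[i], fit[i] = trial, tf
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]
```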

  • Towards Real-time Recognition of Activities in Smart Homes   Order a copy of this article
    by Sook-Ling Chua, Lee Kien Foo, Saed Juboor 
    Abstract: Many supervised methods have been proposed to infer the activities of inhabitants from a variety of sensors installed in the home. Current activity recognition systems either assume that the sensor stream has been pre-segmented or use a sliding window for activity segmentation, which makes real-time activity recognition difficult due to the presence of temporal gaps between successive sensor activations. In this paper, we propose a method based on a set of hidden Markov models that can simultaneously solve the problems of activity segmentation and recognition on streaming sensor data without relying on any sliding-window method. We demonstrate our algorithm on sensor data obtained from two publicly available smart home datasets.
    Keywords: Real-time; Activity Recognition; Activity Segmentation; Streaming Data; Hidden Markov Model.
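    The decoding step that such HMM-based recognisers rely on can be sketched with a minimal Viterbi implementation (an illustration of the standard algorithm with a made-up two-activity example; the states, probabilities and observation names are ours, not the authors' model):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden state sequence for an observed sensor stream."""
    # path[t][s] = (probability of the best path ending in s, that path)
    path = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}]
    for o in obs[1:]:
        layer = {}
        for s in states:
            prob, prev = max((path[-1][p][0] * trans_p[p][s], p) for p in states)
            layer[s] = (prob * emit_p[s][o], path[-1][prev][1] + [s])
        path.append(layer)
    best = max(states, key=lambda s: path[-1][s][0])
    return path[-1][best][1]
```

    For example, with hypothetical activities "sleep" and "cook" and binary sensor events, a "stove_on" observation pulls the decoded state towards "cook" even after a run of "no_motion" events.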

  • Supervised Approach for Object Identification using Speeded Up Robust Features   Order a copy of this article
    by Pooja Agrawal, Teena Sharma, Nishchal K. Verma 
    Abstract: This paper proposes a novel vision-based approach for real-time object counting that uses textural information. Speeded-up robust features (SURF) are used to extract the textural information from the image. First, the approach selects stable SURF features from a prototype image of the object of interest. These features are matched with the SURF features of the scene image captured through the vision interface. Feature grid vectors (FGVs) and feature grid clusters (FGCs) are formed from the matched SURF features in the scene to indicate the presence of the object, and support vector machine (SVM) learning is used to identify true instances of the object. A parameter-tuning approach is used to find optimized heuristics for higher accuracy and less computation. The proposed approach performs well irrespective of illumination, rotation and scale, and a run-time environment has also been developed to report the object count in real time.
    Keywords: Object identification; object counting; SURF; SVM classifier; feature grid vector; feature grid cluster.

  • Optimal Design of QFT Controller for Pneumatic Servo Actuator System using Multi-objective Genetic Algorithm   Order a copy of this article
    by Nitish Katal, Shiv Narayan 
    Abstract: Loop shaping is the principal step in synthesizing Quantitative Feedback Theory (QFT) based robust controllers. The controller assures performance robustness in the presence of plant uncertainties. This paper explores a template- and bounds-free approach for the automated synthesis of a low-order fixed-structure QFT controller for a highly uncertain pneumatic servo actuator system. In this work, the loop-shaping problem has been posed as a multi-objective optimization problem and solved using the multi-objective variant of the genetic algorithm. At the end of the design process, a set of Pareto optimal solutions (POS) is obtained; to aid the decision maker in choosing an ideal solution from the POS, the use of level diagrams has been explored. The simulation of the results and the time- and frequency-domain analysis have been carried out using Matlab, and the results obtained clearly reveal that the designed QFT controller offers robust behavior over a range of plant parametric uncertainty.
    Keywords: Quantitative Feedback Theory; Multi-objective Genetic Algorithm; Automatic Loop Shaping; Robust Stability; Level Diagrams.

  • Hybrid BATGSA: A Meta Heuristic Model For Classification of Breast Cancer Data   Order a copy of this article
    by Umme Salma M, Doreswamy H 
    Abstract: Nature-inspired algorithms have a vast range of applications. One such application is in the field of medical data mining, where the major focus is on building models for the classification and prediction of various diseases. Breast cancer has grabbed the interest of numerous researchers because it is a major killer disease, killing millions of women across the globe. In this paper, we propose a hybrid diagnostic model which is a fusion of the Bat Algorithm (Bat), the Gravitational Search Algorithm (GSA) and a feed-forward neural network (FNN). Here, the potential of the FNN and the advantages of nature-inspired algorithms have been exploited to build a hybrid model used for the classification of breast cancer data. The proposed model consists of two modules. The first is the training module, where the data is trained using a feed-forward neural network; the second is an error-minimizing module, built using the Bat and GSA metaheuristic algorithms. The hybrid model minimizes the error, thus producing better classification results. The accuracy obtained for the Wisconsin Diagnostic Breast Cancer (WBCD) data set is found to be 94.28% and 92.10% for training and testing respectively.
    Keywords: Breast Cancer; Bat algorithm; Gravitational Search Algorithm; Classification; Metaheuristic.

Special Issue on: New Trends for Security in Network Analytics and Internet of Things

  • A Novel Encryption Compression Scheme using Julia sets   Order a copy of this article
    by Kunti Mishra, Bhagwati Prasad 
    Abstract: The intent of the paper is to propose a novel fractal based encryption compression scheme using logistic map and Julia sets. In our study of medical images, we obtain significant lossless compression and secure encryption of the image data. The proposed technique is expected to be useful for the transmission of various confidential image data relating to medical imaging, military and other multimedia applications.
    Keywords: Logistic map; Encryption; Decryption; Compression; Julia sets.
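
    The logistic-map half of such a scheme can be sketched as a keystream cipher: iterate the map, quantize each iterate to a byte, and XOR it with the image data. This is only an illustrative sketch (the Julia-set stage and the authors' exact quantization are not reproduced), and the seed and parameter values below are arbitrary.

```python
def logistic_keystream(x0, r, n):
    """n keystream bytes from logistic-map iterates x_{k+1} = r*x_k*(1-x_k)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 255.0) & 0xFF)  # quantize each iterate to one byte
    return out

def xor_cipher(data, x0=0.31415926, r=3.99):
    """XOR data with the chaotic keystream; applying it twice decrypts."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

plain = b"confidential medical image bytes"
cipher = xor_cipher(plain)        # encrypt
recovered = xor_cipher(cipher)    # decrypt with the same (x0, r) secret
```

    The XOR construction is involutory, so the same routine encrypts and decrypts; the secret is the pair (x0, r) seeding the map.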

  • Perplexed Bayes Classifier based Secure & Intelligent Approach for Aspect Level Sentiment Analysis.   Order a copy of this article
    by Sumit Kumar Yadav, Devendra K. Tayal, Shiv Naresh Shivhare 
    Abstract: In this work, we use machine learning methods to classify a review document. We use two machine learning methods: the Naive Bayes Classifier and the Perplexed Bayes Classifier. First, we briefly introduce the Naive Bayes Classifier, its shortcomings, and the Perplexed Bayes Classifier. Further, we train the classifiers using a small training set and use a test set with reviews having dependency among their features. We then show how the Naive Bayes Classifier fails to classify such reviews, and show that the Perplexed Bayes Classifier can be used to classify the given test set, which has dependency among its features.
    Keywords: sentiment-analysis; machine-learning techniques; naïve bayes; perplexed bayes; aspect level sentiment analysis.
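
    For reference, the Naive Bayes side of the comparison can be sketched in a few lines: each feature (word) contributes an independent likelihood term, which is exactly the assumption that breaks down when review features are dependent. The toy reviews and labels below are invented for illustration.

```python
import math
from collections import Counter

class NaiveBayes:
    """Multinomial Naive Bayes with Laplace smoothing over bag-of-words reviews."""
    def fit(self, docs, labels):
        self.classes = set(labels)
        self.prior = Counter(labels)
        self.word_counts = {c: Counter() for c in self.classes}
        self.vocab = set()
        for doc, c in zip(docs, labels):
            for w in doc.split():
                self.word_counts[c][w] += 1
                self.vocab.add(w)
        self.total = {c: sum(self.word_counts[c].values()) for c in self.classes}
        self.n = len(labels)

    def predict(self, doc):
        def score(c):
            s = math.log(self.prior[c] / self.n)
            for w in doc.split():
                # independence assumption: each word contributes its own likelihood
                s += math.log((self.word_counts[c][w] + 1) /
                              (self.total[c] + len(self.vocab)))
            return s
        return max(self.classes, key=score)

nb = NaiveBayes()
nb.fit(["good great phone", "great battery good", "bad poor screen", "poor bad camera"],
       ["pos", "pos", "neg", "neg"])
```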

  • An Efficient Crypto-compression Scheme for Medical Images by Selective Encryption using DCT   Order a copy of this article
    by Med Karim Abdmouleh, Hedi Amri, Ali Khalfallah, Med Salim Bouhlel 
    Abstract: Nowadays, modern communication inevitably uses computer networks. The images transmitted across these networks are special because of their large amount of information. Thus, the use of information technology in the medical field generates many applications (especially telemedicine) where the exchange of medical information remains the foundation of their success. The transmission of these images raises a large number of unresolved problems. The efficiency of a transmission network depends, on the one hand, on the degree of security and, on the other hand, on the times of transmission and archiving. These requirements can be satisfied by encryption and compression. This work presents a method of partial or selective encryption for medical images. It is based on the encryption of some quantized Discrete Cosine Transform (DCT) coefficients in the low and high frequencies. The results of several experiments show that the proposed scheme provides a significant reduction in processing time during encryption and decryption, without compromising the high compression rate of the compression algorithm.
    Keywords: Crypto-compression; Medical image; Telemedicine; DCT; RSA.

  • Hybrid Approach to Enhance Contrast of Image for Forensic Investigation Using Segmented Histogram   Order a copy of this article
    by Sachin Dube, Kavita Sharma 
    Abstract: Digital images can be used in the detection of various crimes, ranging from active to passive attack applications. To suit a particular attack application an image needs to be enhanced, and it should have good quality in general for forensic investigation. For normal investigation use, a vibrant, vivid and eye-pleasing image is desired. In this paper, various existing methods and their drawbacks are examined. This information is then used to develop an approach for contained enhancement that retains the natural look of an image and enhances its quality to make it usable as evidence. The existence of a spike in the histogram can result in over-enhancement of the image; a spike is created when a large number of pixels share a small set of intensities. The ten most commonly used standard images are used for performance comparison. The proposed method outperforms the compared methods in terms of PSNR and AMBE values, while keeping entropy and standard deviation almost similar to the input image.
    Keywords: Image Forensic; Segmented Histogram; Image Contrast Enhancement.
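
    The spike problem described above can be illustrated with a generic clip step (a CLAHE-style technique, not necessarily the authors' method): bins above a clip limit are truncated and the excess redistributed before the equalization mapping is built, which contains the over-enhancement a spike would otherwise cause.

```python
def clipped_equalize(pixels, levels=256, clip_ratio=3.0):
    """Histogram equalization with spike clipping: bins above
    clip_ratio * mean are truncated and the excess redistributed,
    limiting over-enhancement caused by histogram spikes."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    limit = clip_ratio * n / levels
    excess = sum(max(0, h - limit) for h in hist)
    hist = [min(h, limit) + excess / levels for h in hist]  # redistribute excess
    # cumulative distribution -> monotone intensity mapping (lookup table)
    cdf, acc = [], 0.0
    for h in hist:
        acc += h
        cdf.append(acc)
    return [round((levels - 1) * c / cdf[-1]) for c in cdf]

# 90 pixels piled on one intensity (a spike) plus a few spread-out values
lut = clipped_equalize([10] * 90 + list(range(100, 110)))
```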

  • Use of A Light Weight Secure Image Encryption Scheme Based on Chaos & DNA Computing for Encrypted Audio Watermarking   Order a copy of this article
    by Bhaskar Mondal, Tarni Mandal, Tanupriya Choudhury 
    Abstract: Watermarking is one of the best ways to authenticate the ownership or the source of data by embedding copyright information into an image, audio or video. At the same time, to hide the source of data from unintended users and maintain anonymity, the watermark needs to be encrypted before embedding. This paper presents an effective use of an encryption algorithm in audio watermarking. The watermark data is initially encrypted with "A Light Weight Secure Image Encryption Scheme Based on Chaos and DNA Computing". In the second part, the encrypted data is embedded into an audio signal using the Discrete Cosine Transform (DCT) and the Discrete Wavelet Transform (DWT). The test results are promising, and the watermarked audio does not lose its quality.
    Keywords: Audio watermarking; cryptography; deoxyribonucleic acid (DNA); watermark encryption.

  • Malware Intelligence: Beyond Malware Analysis   Order a copy of this article
    by Ekta Gandotra, Divya Bansal, Sanjeev Sofat 
    Abstract: A number of malware samples are available online, but little research has attempted to thoroughly analyze these for obtaining insights or intelligence about their behavioral trends, which can further be used to issue early warnings about future threats. In this paper, we have performed an in-depth analysis of about 0.1 million historical malware specimens in a sandbox environment to generate their attributes and behavior. Afterwards, the intelligent information is mined using statistical analysis to study their behavioral trends and capabilities. The information so obtained can help to gain insight into the future measures that malware authors can use to design their programs. The paper also highlights the challenges evolving out of these trends, which provide future research directions for malware analysts and security researchers. Further, the insights generated can be shared with security experts, CERTs (Computer Emergency Response Teams) or other stakeholders so that they can issue preventive measures for future threats, or at least minimize the risks posed by them. Furthermore, this type of analysis helps the research community in selecting the parameters/factors for building faster and improved techniques for detecting unknown malicious programs.
    Keywords: Malware analysis; statistical analysis; security intelligence; behavioral trends; prediction.

  • Trust evaluation of websites: A comprehensive study   Order a copy of this article
    by Himani Singal, Shruti Kohli 
    Abstract: People rely heavily on the internet to fulfill even the most minuscule of their needs. According to a survey, 41% of time spent on the web is for finding some information from search engines or reading some information. This is majorly due to easily accessible, cost-effective and perceived high-value information. But this perceived high-value information can prove fatal if consumed without any authoritarian checks, especially if related to issues like health. Some template is necessitated to measure the trustworthiness of such information. This paper explores a novel approach to quantify trust in such information-led websites. Analytical data is collected for various informational websites, and trust is modeled for these websites using human behavior as an aggregate. Analytical data is believed to capture the actual behavior of each and every visitor visiting the website for information, thus making the study reliable and dependable. Results have been compared with some other acceptable studies and have been found to be encouraging.
    Keywords: Content Trust; Health Information; Medical Trust; Online Interaction; User Satisfaction; Web Trust.

  • An Epidemic Model for Security and Performance of Wireless Sensor Networks   Order a copy of this article
    by Rudra Pratp Ojha, Kavita Sharma, Pramod Kumar Srivastava, Goutam Sanyal 
    Abstract: Wireless sensor networks have inherent constraints that make security a crucial issue. Worm transmission starts from a single node and spreads across the entire network through wireless communication; this process can lead to the failure of the whole wireless sensor network. The proposed mathematical model is based on epidemic theory, in which different classes of nodes are considered, in order to examine the effect of each class on the network and to develop a control mechanism that prevents worm transmission in sensor networks. We discuss the role of the communication radius in the stability of the network and examine the proposed model using the stability theory of differential equations. We determine the basic reproduction number and relate it to the communication radius, analyze how the proposed model improves the efficiency of the network in terms of stability and energy efficiency, and validate the proposed model through extensive simulation results.
    Keywords: Epidemic model; Wireless Sensor Network; Equilibrium; Stability; Communication Radius; Basic reproduction number.
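
    The simplest skeleton of such an epidemic model (without the latent classes and communication-radius terms the paper adds) is the classic SIR system, which can be integrated numerically; all parameter values below are illustrative only.

```python
def simulate_sir(beta=0.4, gamma=0.1, s0=0.99, i0=0.01, days=200, dt=0.1):
    """Forward-Euler integration of the basic worm-propagation model:
       dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I."""
    S, I, R = s0, i0, 0.0
    for _ in range(round(days / dt)):
        new_inf = beta * S * I          # susceptible nodes infected per unit time
        rec = gamma * I                 # infected nodes recovered (patched) per unit time
        S, I, R = S - new_inf * dt, I + (new_inf - rec) * dt, R + rec * dt
    return S, I, R

R0 = 0.4 / 0.1   # basic reproduction number beta/gamma; the worm spreads when R0 > 1
S, I, R = simulate_sir()
```

    With R0 above 1 the infected fraction rises, peaks, and burns out as susceptibles are exhausted; the node total S + I + R is conserved by construction.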

  • Secure Handoff Technique with Reduced Authentication Delay in Wireless Mesh Network   Order a copy of this article
    by Geetanjali Rathee, Hemraj Saini 
    Abstract: The aim of this manuscript is to propose a secure handoff procedure by generating tickets for the mesh clients, which are divided into different zones of mesh routers according to their communication range. An authentication server looks over the entire network after a specific interval of time and is responsible for generating and updating the corresponding tickets of clients according to their zonal routers' range. Whenever a mesh client enters the range of another domain, to access the services of foreign mesh routers the roaming client has to prove its authenticity to the corresponding zonal router. Each mesh router stores the tickets of its zonal mesh clients issued by the authentication server and validates a roaming client by matching its ticket. The proposed mechanism reduces storage overhead and security threats at the mesh client, as all the tickets are stored in the authentication server database and are issued upon request. The proposed technique is validated over authentication delay and different probabilistic scenarios of authentication, and is proved legitimate through an empirical study against the reported literature.
    Keywords: Wireless Mesh Network; secure handoff; authentication; security threats; network delay; storage overhead.

  • A Secure, Fast Insert and Efficient Search Order Preserving Encryption Scheme for Outsourced Databases   Order a copy of this article
    by K. Srinivasa Reddy, Ramachandram S 
    Abstract: Order Preserving Encryption (OPE) schemes have been studied to a great extent in the cryptography literature because of their potential application to database design. For the first time, a scheme called mutable order preserving encoding (mOPE) was introduced to achieve IND-OCPA (Indistinguishability under Ordered Chosen Plaintext Attack) security. However, even the mOPE scheme potentially leaks the distribution of repeated ciphertexts and is less efficient. In this paper, a new scheme is introduced, called Secure and Cost efficient Order Preserving Encryption (SCOPE), which is considerably more secure and efficient than the mOPE scheme. A new form of strong security notion, called Indistinguishability under Ordered Chosen Repeated Plaintext Distribution Attack (IND-OCRPDA), is proposed, and we show that the SCOPE scheme is IND-OCRPDA secure. Finally, the experimental results show that SCOPE achieves good performance in the context of an encrypted database and has a reasonable overhead which is 3.5
    Keywords: efficiency; functionality; order preserving encryption; trusted proxy; security.

  • Security Model against worms attack in Wireless Sensor Network   Order a copy of this article
    by Rudra Pratap Ojha, Pramod Kumar Srivastava, Goutam Sanyal 
    Abstract: The Wireless Sensor Network is an innovative category of communication network, which has earned universal attention due to its great potential for application in various areas. It is also an insecure system, due to the attack of worms. In order to efficaciously defend wireless sensor networks against worms, we have proposed an epidemic model with two latent periods and vaccination. We have formulated the ODEs of the model, studied the dynamic behavior of worm propagation, and designed a model to secure the system against worm attack. The model has been simulated in MATLAB. In this proposed study, we have determined the basic reproduction number for the study of the dynamic performance of worms in the wireless sensor network. The global stability of the worm-free equilibrium has been established using a Lyapunov function, while the simulation results helped in the validation of the theoretical analysis.
    Keywords: Security; Epidemic model; Wireless Sensor Network; Latent period; Basic reproduction number.

  • Untraceable privacy-preserving authentication protocol for RFID tag using salted hash algorithm   Order a copy of this article
    by Pinaki Ghosh, Mahesh TR 
    Abstract: Radio Frequency Identification (RFID) has now become a core technology in the Internet of Things (IoT) and has gained the attention of industry and academia in tremendous ways. Due to their openness in nature, RFID tags suffer from potential security threats; one of the major threats is privacy leakage during the authentication process. A strong Privacy Preserving Authentication (PPA) protocol is always needed for such a system. In this paper we propose a salted secure-hash-based mutual authentication protocol as a solution. The proposed protocol is designed to send a random response from the tag to the server without disclosing the tag's identity information to intermediate entities like readers. It also updates secret keys without transmitting the secret values.
    Keywords: RFID; privacy; untraceability; tag authentication; salted hash; keyed hash algorithm; mutual authentication.
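
    A generic salted-hash challenge-response of this flavor (an illustrative sketch, not the paper's exact protocol) can be written as follows; the random per-reply salt is what makes successive tag responses unlinkable, and the server identifies the tag by scanning its key store rather than receiving an identity in the clear.

```python
import hashlib
import os

def salted_response(key, salt, nonce):
    """Keyed, salted hash reply; a sketch, not the paper's exact construction."""
    return hashlib.sha256(salt + key + nonce).hexdigest()

class Server:
    def __init__(self):
        self.keys = {}           # tag_id -> shared secret key

    def register(self, tag_id, key):
        self.keys[tag_id] = key

    def challenge(self):
        return os.urandom(16)    # fresh nonce per session prevents replay

    def identify(self, salt, nonce, response):
        """Scan stored keys; the tag never transmits its identity in the clear."""
        for tag_id, key in self.keys.items():
            if salted_response(key, salt, nonce) == response:
                return tag_id
        return None

class Tag:
    def __init__(self, key):
        self.key = key

    def respond(self, nonce):
        salt = os.urandom(8)     # random salt makes successive replies unlinkable
        return salt, salted_response(self.key, salt, nonce)

server = Server()
server.register("tag-42", b"shared-secret")
tag = Tag(b"shared-secret")
nonce = server.challenge()
salt, resp = tag.respond(nonce)
```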

  • Comparison of different RSA Variants   Order a copy of this article
    by Seema Verma, Manoj Kumar 
    Abstract: RSA is the first public key algorithm used for encryption and decryption. Its simplicity lies in its structure, while its security rests on the difficulty of factoring a very large composite integer. It is still popular even after thirty-nine years since its origin. In this long journey, RSA has been studied many times and many security loopholes have been found. To remove these loopholes, researchers have designed many variants of RSA. This work presents a study of the different RSA variants that are popular in the literature, including an analysis of the performance and security of each variant.
    Keywords: RSA; Public key; Cryptography; Encryption; Complexity; Security; Comparison.
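
    For background, the textbook RSA that all these variants modify works as below. The primes are the small classroom values 61 and 53 (far too small for real use) and the helper names are our own.

```python
def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def rsa_keygen(p, q, e=65537):
    """Textbook RSA: n = p*q, d = e^(-1) mod phi(n). Toy primes only."""
    n, phi = p * q, (p - 1) * (q - 1)
    g, d, _ = egcd(e, phi)
    assert g == 1, "e must be coprime to phi(n)"
    return (n, e), (n, d % phi)

pub, priv = rsa_keygen(61, 53, e=17)   # n = 3233, the classic classroom example
cipher = pow(65, pub[1], pub[0])       # encrypt m = 65: c = m^e mod n
message = pow(cipher, priv[1], priv[0])  # decrypt: m = c^d mod n
```

    Every variant surveyed changes some part of this pipeline (key generation, exponent choice, or the modulus structure) while keeping the factoring problem at the core.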

  • GASER: Genetic Algorithm based Secure and Energy aware Routing protocol for Sparse Mobile Ad Hoc Networks   Order a copy of this article
    by Deepika Kukreja, Deepak Kumar Sharma, S.K. Dhurandher, B. V. R. Reddy 
    Abstract: Sparse Mobile Ad hoc Networks are characterized by sparse node deployment and longer network partitions. Nodes in an ad hoc network are mobile, have limited energy and are deployed in areas where connections between the nodes may be inconsistent. In a number of scenarios it is likely that the route between a source-destination pair does not exist for long durations of time. Routing in such a network, where node deployment is sparse and connections between nodes occur less frequently, is a challenging task. In this paper, a nature-inspired Genetic Algorithm based Secure and Energy aware Routing (GASER) protocol for Sparse Mobile Ad Hoc Networks is proposed. Black hole and gray hole attacks are two security threats that weaken Mobile Ad Hoc Networks (MANETs) by inducing packet forwarding misbehavior in the network. By incorporating a genetic algorithm with other methods, the GASER protocol selects the best path for routing packets between source and destination such that the selected path is shortest; nodes on the selected path have the highest message forwarding probability among the nodes of the network and enough energy to receive and then forward messages. GASER avoids the nodes inducing gray hole/black hole attacks in the network, as it selects the next hop with the higher message forwarding probability, thus making the routing protocol secure. Simulation results show that GASER outperforms PROPHET, Epidemic and Spray and Wait in terms of packet delivery ratio, average residual energy, overhead ratio and number of deceased nodes.
    Keywords: Sparse Mobile Ad Hoc Networks; Genetic algorithm; Black hole attack; Gray hole attack; Energy aware routing; Secure routing.

Special Issue on: Sensor Networks and Cloud Computing

  • A Review on Congestion Control System using APU and D-FPAV in VANET   Order a copy of this article
    by Christy Jackson, Vijayakumar V 
    Abstract: Over the last few years, Vehicular Ad hoc Networks have been playing a vital role in much research around the world. Vehicular Ad hoc Networks (VANETs) have a wide range of applications, of which Intelligent Transport Systems (ITS) is a major area. Applications such as safety, entertainment on the go and traffic advisories are some of the recent advances in VANET. This paper addresses the issues concerning vehicular traffic congestion. It is observed that the United States and the United Kingdom lose 2% and 5% of their Gross National Product (GNP), respectively, due to traffic congestion. The paper provides a review of two congestion mechanisms: Adaptive Position Update (APU) and Distributed Fair Transmit Power Adjustment in VANET (D-FPAV). APU is a strategy that dynamically adjusts the frequency of position updates based on the mobility dynamics of the nodes and the forwarding patterns in the network. D-FPAV controls congestion by adjusting the node transmission power, where a node's transmit power setting depends on predictions of application-layer traffic and the observed number of vehicles in the surroundings. The paper includes a simulation of a normal VANET setup without congestion control and a VANET setup with APU and D-FPAV employed. The simulation results show that employing these congestion techniques reduces the time delay caused by traffic congestion.
    Keywords: APU; D-FPAV; GPSR; VANET.

  • Dissection of the Experimental Outcome of Split-protocol   Order a copy of this article
    by Bharat S Rawal, Qiang Duan, Pandi Vijayakumar 
    Abstract: The Split-protocol concept was developed for load balancing and quicker data communication. The Split-protocol computing paradigm uses web services on geographically distributed web servers in the cloud: a system of split-servers forms the cloud to handle computing and storage tasks that would otherwise create massive CPU utilization on a traditional individual server. In an earlier paper, we established that applying the Split-protocol produces higher performance compared with traditional clusters. Depending on need, diverse types of split configurations are applied for higher throughput and better response and connection times. The observed throughput enhancement was within the range of 6.5% - 25% over non-split systems. This paper examines the empirical results of split systems to understand their behavior compared with non-split systems. The Split-protocol was implemented on a private cloud for the internal data servers of the organization, not made available to the general public. The split concept emerged from the HTTP/TCP/IP network protocol implementation. The split-system model, under given sets of constraints, can produce better throughput than conventional equivalent server systems. In this paper, we present an analytical model to support the high performance of a Split-protocol implementation, and we also mathematically evaluate the inherent reliability characteristics of the split-system.
    Keywords: parallel computing; performance; pipeline; splitting http requests; cloud computing.

  • Energy Saving Offloading Scheme for Mobile Cloud Computing using CloudSim   Order a copy of this article
    by Thanapal P, Saleem Durai M A 
    Abstract: In this paper, we focus on offloading for saving energy in Mobile Cloud Computing (MCC). MCC is emerging as a noticeable research area that seeks to bring the massive advantages of the cloud to constrained mobile devices. The impediments of poor processing capacity and constrained battery life make it difficult for mobile devices to process complex computational tasks. The energy system concentrates on the mobile device's battery energy and utilizes computational offloading for computationally intensive mobile applications on the mobile device. Each time an application on the physical mobile device is initialized, the power utilization is calculated by the framework, which decides whether or not to offload. This paper proposes a novel energy-saving computational offloading framework for the processing of intensive mobile applications in MCC. It is found that the energy utilization of the selected application reduces by up to 80.94% and the execution time reduces by up to 97.02% with computational offloading using CloudSim, when contrasted with conventional methods. It enhances the quality of service for mobiles and helps in maintaining constant responses for mobile applications.
    Keywords: Cloud computing; Energy efficiency; Offloading; Green computing; Mobile cloud computing; Mobile device; Computational offloading; save energy; CloudSim.

    by Luke Christie, Gajendra Kumar 
    Abstract: Research in Artificial Intelligence and Cognitive Computing (the simulation of human thought processes in machines, involving self-learning systems and processes that efficiently use pattern recognition, such as detecting voice or acquiring data through data mining, and imitating human thought in machines) is an evolutionary and innovative field that draws knowledge from the liberal arts and humanities, such as management science, philosophy and psychology. The need to rely on these disciplines is of vital importance and of strategic value in today's technology upgrades, as robots are designed to think and behave like human beings. Robotic technology can assess and solve problems with the aid of human intervention and support. Tasks of a laborious nature can be completed and solved through machine technology and processes, or machines and humans must work together to solve larger problems; the dire need is for humans and machines to work together in solving the problems of global warming and climate change. My paper will argue that AI and Cognitive Computing processes do make life easier, more convenient and faster for human beings. Take, for instance, the procedures where human beings were replaced by machines on an assembly line that manufactures cars. The need of the hour in today's age and society is for human beings to put machines to work and invest knowledge and techniques in combating global warming and climate change through intelligence and smart thinking. My argument will come from assessing issues that are man-made and can be solved through the aid of technology; it borders on the fact that the intelligence machines have acquired through the ages can be utilized tremendously in solving major problems that we face today, such as global warming and climate change. AI and Cognitive Computing are used against security threats and by NASA for its bold ventures in space research. India's investment in technology through foreign direct investment, and her start-up initiatives in technology, will offer a glimmer of hope if this technology acumen is used to solve tough challenges from climate to healthcare to energy and security, without affecting her position as an invaluable player in the global economy.
    Keywords: Cognitive Computing; Artificial Intelligence; Low-carbon economy; renewable energy; clean energy.

  • Increased Level of Security using DNA Steganography   Order a copy of this article
    by Vijayakumar Perumal, Rajashree R, Vijayalakshmi V 
    Abstract: Security has become one of the important fields of concern in the scenario of information transport. Cryptography is a traditional technique that hides a message by encoding it. Nevertheless, it is now demanded not only that the data being transmitted are encoded, but also that the very existence of the data transmission be concealed. Steganography is thus a technique where the data are not only encoded but also hidden within a carrier, thereby providing two-level security. This paper describes a highly secure steganography technique using DNA sequences, and suggests a method of hiding an image within an image using a DNA sequence.
    Keywords: Deoxyribonucleic Acid (DNA); Cryptography; Polymerase Chain Reaction (PCR); Steganography; Cover image; Secret image; DNA Steganography; European Bioinformatics Institute (EBI).

  • Power modeling of sensors for IoT using Reinforcement Learning   Order a copy of this article
    by Pradeep Kumar TS, Venkata Krishna P 
    Abstract: The Internet of Things (IoT) is a technology where all things, such as household equipment and industrial elements, are monitored by sensors and controlled by actuators. For a large-scale IoT application, sensors are needed in huge numbers, and all these sensors are powered by small batteries. Hence, the lifetime of these miniature devices can be improved by optimizing their power consumption, and modeling these sensors is a must for such applications. This paper models the sensors for IoT applications in a multi-layered IoT network. Reinforcement learning is used to model the sensors at the physical, routing and network layers, and the EEIT framework is used to model the nodes that optimize energy consumption at these layers. Physical-layer modeling deals with hardware aspects of the sensors, such as transmission power and the radio. The routing and network layers deal with the communication capabilities of the sensors (transmitting and receiving data, dissemination, routing, etc.). We conduct numerical simulations and emulations using the EEIT framework for IoT systems that are helpful for the design of complex IoT systems. Our results are quantified empirically in terms of sensor lifetime, energy usage and communication costs.
    Keywords: Wireless sensor networks; Reinforcement Learning; Internet of Things (IoT); Sensors.
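
    A minimal reinforcement-learning sketch in this spirit (with invented states, actions and rewards, not the EEIT framework itself) is tabular Q-learning over a node's sleep/transmit decision: the agent learns when spending radio energy is worth the data it delivers.

```python
import random

# Toy tabular Q-learning sketch: a node picks "sleep" or "transmit" per time slot.
# States are buffer fill levels 0..3; rewards trade delivered data against radio
# energy. All numbers are invented for illustration.
ACTIONS = ["sleep", "transmit"]
random.seed(7)

def reward(buffer_level, action):
    if action == "transmit":
        return 2.0 * buffer_level - 1.0   # data delivered minus fixed radio cost
    return -0.1 * buffer_level            # small latency penalty for holding data

Q = {(b, a): 0.0 for b in range(4) for a in ACTIONS}
alpha, gamma, eps = 0.2, 0.9, 0.1
state = 0
for _ in range(5000):
    if random.random() < eps:             # epsilon-greedy exploration
        act = random.choice(ACTIONS)
    else:
        act = max(ACTIONS, key=lambda a: Q[(state, a)])
    r = reward(state, act)
    nxt = 0 if act == "transmit" else min(state + 1, 3)  # buffer empties or fills
    best_next = max(Q[(nxt, a)] for a in ACTIONS)
    Q[(state, act)] += alpha * (r + gamma * best_next - Q[(state, act)])
    state = nxt

# The learned policy: sleep while the buffer is cheap to hold, transmit when full.
policy = {b: max(ACTIONS, key=lambda a: Q[(b, a)]) for b in range(4)}
```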

    by Subramaniyaswamy Vairavasundaram, Logesh R 
    Abstract: During a cricket match, the commentary is the important element that keeps viewers entertained and updated about the live game. The commentary can be made interesting by narrating relevant stories related to the live scenario of the game. But the knowledge of commentators, however wide for humans, is relatively limited compared to the story details available about individual players. Here a system called iSCoReS for Individuals is presented, which can automatically suggest stories for commentators to tell during the live game. iSCoReS for Individuals is equipped with some scored examples of story-game state pairs, from which it learns offline to connect sports stories of individual players to game states. Features describing the viewer can be input into the system, and stories can be selected partly based on these features. The problem addressed is to find and match interesting stories, as live commentary to a sports game, for individuals.
    Keywords: Artificial Intelligence; Information Retrieval; Sports broadcasting; Colour Commentary; Human Computer Interaction; Automation.

  • False Data Detection and Dynamic Selection of Aggregator Nodes with Pair-wise Key Establishment in Homogeneous Wireless Sensor Networks   Order a copy of this article
    by Sandhya M.K, Murugan K, Devaraj P 
    Abstract: Compromised sensor nodes inject false data into wireless sensor networks, which distorts data integrity and consumes battery power unnecessarily. In the existing false data detection schemes, the aggregator nodes suffer from rapid battery drain due to the computational overhead, leading to reduced network lifetime. To avoid this, the aggregator nodes must be dynamically selected from the sensor nodes in the network. This dynamic selection of aggregator nodes introduces security challenges in symmetric key exchange among the sensor nodes. In this paper, a scheme called False Data Detection - Dynamic Selection of Aggregator Nodes is proposed to address these issues. This scheme discards the false data injected into the network and also prolongs the network lifetime through the dynamic selection of aggregator nodes. The problem of symmetric key exchange arising from the dynamic selection of aggregator nodes is resolved by the proposed Chebyshev polynomial based pair-wise key establishment, which has lower computational overhead and offers better security strength. Simulation results indicate that the scheme eliminates false data injected by multiple compromised nodes and also offers higher network lifetime in homogeneous wireless sensor networks.
    Keywords: Aggregator Node Selection; Chebyshev polynomial; False data detection; Network Lifetime; Pair-wise key establishment; Homogeneous wireless sensor networks.
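The abstract does not spell out the construction, but Chebyshev polynomials support a Diffie-Hellman-style key agreement through their semigroup property T_a(T_b(x)) = T_b(T_a(x)) = T_ab(x). A minimal sketch, assuming evaluation over a prime field for numerical stability; the modulus, public seed and private degrees below are purely illustrative, not values from the paper:

```python
P = 2**31 - 1  # a Mersenne prime used as an illustrative modulus

def chebyshev(n, x, p=P):
    """T_n(x) mod p via the recurrence T_n = 2x*T_{n-1} - T_{n-2}."""
    if n == 0:
        return 1 % p
    t_prev, t_curr = 1, x % p
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, (2 * x * t_curr - t_prev) % p
    return t_curr

# A sensor node (A) and a newly elected aggregator (B) agree on a key:
x = 12345                       # public seed
a, b = 467, 1031                # each side's private degree (its secret)
pub_a = chebyshev(a, x)         # A publishes T_a(x)
pub_b = chebyshev(b, x)         # B publishes T_b(x)
key_a = chebyshev(a, pub_b)     # A computes T_a(T_b(x)) = T_ab(x)
key_b = chebyshev(b, pub_a)     # B computes T_b(T_a(x)) = T_ab(x)
assert key_a == key_b           # both ends now hold the same pair-wise key
```

The iterative recurrence keeps the per-node cost linear in the degree, which is the kind of low computational overhead the abstract claims for the scheme.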

  • Health Data Analytics using Scalable Logistic Regression with Stochastic Gradient Descent   Order a copy of this article
    by Gunasekaran Manogaran, Daphne Lopez 
    Abstract: As wearable medical sensors continuously generate enormous volumes of data, the data are difficult to process and analyze. This paper focuses on developing a scalable sensor data processing architecture in cloud computing to store and process body sensor data for health care applications. The proposed architecture uses big data technologies such as Apache Flume, Apache Pig and Apache HBase to collect and store huge sensor data in Amazon Web Services. The Apache Mahout implementation of a MapReduce-based online stochastic gradient descent algorithm is used in logistic regression to develop the scalable diagnosis model. The Cleveland Heart Disease Database (CHDD) is used to train the logistic regression model. Wearable body sensors provide the blood pressure, blood sugar level and heart rate of the patient to predict heart disease status. The proposed prediction model efficiently classifies heart disease, with training and validation accuracies of 81.99% and 81.52%, respectively.
    Keywords: Stochastic Gradient Descent; MapReduce Logistic Regression; Apache Flume; Apache HBase; Apache Pig; Hadoop Distributed File System; Wearable Medical Sensor; Body Sensor; Clinical Data; Big Data Analytics; Health Care; Cloud Computing; Cleveland Heart Disease Database; Amazon Web Service; Ubiquitous Computing.
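As a rough illustration of the learning core named in the abstract (the Mahout/MapReduce plumbing is omitted), here is a minimal online stochastic gradient descent update for logistic regression; the three "vitals" features mirror the abstract, but all data values are synthetic:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sgd_logistic(samples, lr=0.1, epochs=50, seed=0):
    """Online SGD for logistic regression.
    samples: list of (feature_vector, label) with label in {0, 1}."""
    rng = random.Random(seed)
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        rng.shuffle(samples)
        for x, y in samples:                      # one example at a time
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y                           # gradient of the log-loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Toy standardised vitals: (blood pressure, blood sugar, heart rate).
data = [([1.2, 1.5, 1.1], 1), ([1.0, 1.3, 0.9], 1),
        ([-1.1, -0.8, -1.0], 0), ([-0.9, -1.2, -1.3], 0)]
w, b = sgd_logistic(list(data))
risk = sigmoid(sum(wi * xi for wi, xi in zip(w, [1.1, 1.4, 1.0])) + b)
```

Processing one example at a time is what makes the method "online" and lets it scale to a continuous sensor stream, which is presumably why the paper pairs it with MapReduce.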

    by Sakthi Ganesh 
    Abstract: The main objective behind designing this AITPMS is vehicle safety, which in turn protects human life. The system wirelessly indicates the current pressure of each individual tire to the vehicle operator and maintains pressure accuracy while the vehicle is running. AITPMS is designed to display real-time tire pressure and to raise an alert at low pressure. The system helps build smart, fuel-efficient vehicles and reduces tire-related accidents by continuously checking and transmitting tire pressure information. In previous AITPMS designs, the sensor unit consumes power unnecessarily because RF signals are generated continuously from pressure sensor data at regular intervals. Some previous AITPMS designs transmit data only when a threshold condition is reached, and other systems are proposed with long-life batteries, since conventional AITPMS always faces the problem of battery replacement at the transmitter section. The proposed work overcomes this battery-replacement drawback of RFID and the other technologies used for wireless communication. With Ambient Backscatter technology, no dedicated power supply is needed to generate the RF signal required to send a message; that is, a battery-free technology is used to perform the wireless communication.
    Keywords: AITPMS system; RF signal; Pressure sensor; Transmitter section; Receiver section; Ambient Backscatter; RF Energy Harvesting system; Microcontroller.

  • A framework to mitigate ARP Sniffing attacks by Cache Poisoning   Order a copy of this article
    by Prabadevi Boopathy, Jeyanthi N 
    Abstract: Today, in the digital era of computing, most network attacks are caused by sniffing sensitive data over the network. Of the various types of sniffing attacks, ARP sniffing causes most LAN attacks (where wired and wireless LANs coexist). ARP sniffing leads to ARP cache poisoning or spoofing. Through ARP sniffing, the attacker tries to learn the (IP, MAC) pair of a victim's system available in the ARP table or in ARP request-reply packets passed over the network, and either exploits the victim's resources or creates a situation that denies the victim's services to its legitimate users. This in turn causes MITM, DoS or DDoS attacks. The major cause of these attacks is the lack of effective authentication mechanisms in the ARP and RARP protocols used for address resolution. This paper describes the working of the ARP protocol and a method to mitigate the attacks caused by ARP cache poisoning. The proposed framework compares the IP-MAC pair in the ARP and Ethernet headers; if any fake entry is suspected, the information is recorded in the fake_list and a message is sent to the gateway or router to alert it to cache poisoning attacks.
    Keywords: ARP cache poisoning; address resolution; man-in-the-middle attacks; host impersonation.
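A minimal sketch of the consistency check the abstract describes: comparing the sender MAC carried inside the ARP payload against the source MAC of the enclosing Ethernet frame, and recording suspected entries in a fake_list. The field names, and the second (binding-change) rule, are illustrative assumptions rather than details taken from the paper:

```python
fake_list = set()        # (ip, mac) pairs suspected of spoofing
known_bindings = {}      # ip -> first MAC seen claiming that ip

def inspect_arp(eth_src_mac, arp_sender_mac, arp_sender_ip):
    """Return True if the ARP frame looks legitimate, False if suspicious."""
    # Rule 1: the ARP payload's sender MAC must match the Ethernet source MAC.
    if eth_src_mac != arp_sender_mac:
        fake_list.add((arp_sender_ip, arp_sender_mac))
        return False
    # Rule 2 (assumed extension): an IP should not silently change its MAC.
    bound = known_bindings.setdefault(arp_sender_ip, arp_sender_mac)
    if bound != arp_sender_mac:
        fake_list.add((arp_sender_ip, arp_sender_mac))
        return False          # caller would alert the gateway/router here
    return True

ok = inspect_arp("aa:bb:cc:00:00:01", "aa:bb:cc:00:00:01", "192.168.1.10")
spoof = inspect_arp("aa:bb:cc:00:00:02", "de:ad:be:ef:00:01", "192.168.1.20")
```

In a real deployment the two MAC arguments would come from a capture library's Ethernet and ARP layers; the logic above is only the comparison step.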

  • A literature survey on the performance evaluation model of semantics enabled web services   Order a copy of this article
    by Shri Devi, Raju G 
    Abstract: Semantically annotating web services is gaining a lot of attention as a crucial facet to support the automatic matchmaking and composition of web services. Therefore, the support of well-known and agreed ontologies and tools for the semantic annotation of web services is becoming a key concern in assisting the diffusion of semantic web services. The objective of this literature review is to summarize the present progress in supporting the annotation of web services and the performance evaluation of such annotated services by providing answers to related research queries. The review follows a predefined procedure that involves automatically searching well-known digital libraries. Our review identified some approaches available for semantically annotating functional and non-functional aspects of web services. However, many of the approaches are either not validated or the validation done lacks credibility. We believe that a substantial amount of work remains to be done to improve the current state of research in the area of supporting semantic web services and in evaluating the performance of annotated web services.
    Keywords: Ontologies; Semantic web services; Functional and non-functional aspects; Systematic literature review.

  • Investigation on Different Clustering Techniques in Wireless Sensor Networks   Order a copy of this article
    by Prabu S, Meenatchi Shanmugam 
    Abstract: Recent years have witnessed increased interest in the potential use of wireless sensor networks (WSNs) in a wide range of applications such as area monitoring, health care monitoring, pollution monitoring, fire detection, landslide detection, natural disaster prevention and machine health monitoring, making WSNs a very active research area. To support high scalability and better data aggregation, sensor nodes are typically grouped into disjoint, non-overlapping subsets called clusters. Clusters produce hierarchical WSNs that enable efficient utilization of the limited resources of sensor nodes and thus extend network lifetime. Specifically, we examine the performance of clustering schemes in terms of their capability and quality aspects. We also discuss enhancements to be made in future clustering schemes. Finally, we summarize and conclude the paper with some future directions.
    Keywords: Wireless Sensor Networks; Clustering; Energy Efficient Clustering; Network Lifetime.

  • Experimental Analysis of Impact of Term Weighting Schemes on Cluster Quality   Order a copy of this article
    by Hannah Grace, Kalyani Desikan 
    Abstract: Text clustering divides a set of texts into clusters such that texts within each cluster are similar in content. It may be used to uncover the structure and content of unknown text sets as well as to give new perspectives on familiar ones. The main goal of document clustering is to find meaningful groups so that analyzing the documents within clusters is much easier than viewing them as a whole collection. Only certain terms extracted from a document can be used for identifying and scoring a document within the collection. Term weighting schemes are used to identify the importance of each term with respect to a collection and assign weights accordingly. Document clustering uses these term weights to compare the similarity between documents. Several term weighting schemes are in use today, but none of them is specific to clustering algorithms. Term frequency based clustering techniques consider the documents as a bag-of-words while ignoring the relationships between the words. So, in this paper we focus our analysis on different term weighting schemes: term frequency (tf), term frequency-inverse document frequency (tfidf), Augmented Term Frequency (ATC) without normalization and Augmented Term Frequency-inverse document frequency (ATCidf). We have used the clustering tool CLUTO to experimentally study the impact of these term weighting schemes on the quality of the clustering solution obtained by applying the Repeated Bisection Partitional Algorithm, with the I2 criterion function available in CLUTO for computing the similarity between documents.
    Keywords: document clustering; term weighting scheme; cluster quality; criterion functions; entropy; purity.
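Two of the compared schemes, raw term frequency (tf) and tf-idf, can be sketched directly; the ATC variants and CLUTO's I2 criterion are not reproduced here, and the toy documents are invented:

```python
import math
from collections import Counter

def tf(doc_tokens):
    """Raw term frequency for one tokenised document."""
    return Counter(doc_tokens)

def tfidf(docs):
    """tf-idf weight per (doc, term): tf * log(N / df),
    where df is the number of documents containing the term."""
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    return [{t: f * math.log(n / df[t]) for t, f in tf(doc).items()}
            for doc in docs]

docs = [["cluster", "text", "text"],
        ["cluster", "weight"],
        ["weight", "term"]]
weights = tfidf(docs)
# "cluster" appears in 2 of 3 docs, so its idf = log(3/2) ≈ 0.405,
# down-weighting it relative to rarer terms like "text" or "term".
```

The down-weighting of collection-wide terms is exactly what distinguishes tfidf clustering runs from plain tf runs in an experiment like the paper's.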

  • Fault Tolerant Big Bang-Big Crunch for Task Allocation in Cloud Infrastructure   Order a copy of this article
    by Punit Gupta, Satya Prakash Ghrera 
    Abstract: Cloud computing is now an industrial standard for large-scale computing and for solving problems with high reliability. It has been adopted by companies worldwide, such as Google, Microsoft and Apple, for resource computing and resource sharing. But as the number of requests to the data centers in the cloud increases, the load and failure probability of a data center increase. The requests therefore need to be balanced efficiently, with a strategy that improves resource utilization, reduces request failures and increases system reliability. Moreover, surveys on cloud computing show that the failure probability increases as the load over distributed independent resources increases. To overcome these issues in cloud Infrastructure as a Service (IaaS), we propose a learning-based, fault-aware Big Bang-Big Crunch algorithm for task allocation that minimizes request failures and improves QoS (Quality of Service) over a data center. The proposed algorithm is inspired by the theory of the evolution of the universe in cosmology. The proposed strategy is shown to have better performance in terms of execution time, scheduling time and request failure rate compared with previously proposed task allocation algorithms.
    Keywords: Cloud computing; QoS; Resource utilization; Failure probability; Reliability; Cloud Infrastructure as a service; Makespan.
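The abstract does not detail the optimizer, but the generic Big Bang-Big Crunch loop (crunch the population to a fitness-weighted centre of mass, then bang new candidates around it with shrinking spread) can be sketched as follows. The fault-aware task-allocation fitness from the paper is replaced by a toy sphere function, and all parameters are illustrative:

```python
import random

def bbbc(fitness, dim, n=30, iters=60, bounds=(-5.0, 5.0), seed=1):
    """Generic Big Bang-Big Crunch minimisation over a box in R^dim."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    for k in range(1, iters + 1):
        costs = [fitness(x) for x in pop]
        # Big Crunch: centre of mass weighted by 1/cost (lower cost = heavier).
        wts = [1.0 / (c + 1e-12) for c in costs]
        total = sum(wts)
        centre = [sum(wt * x[d] for wt, x in zip(wts, pop)) / total
                  for d in range(dim)]
        # Big Bang: scatter new candidates around the centre,
        # with a spread that shrinks as the iteration count grows.
        spread = (hi - lo) / k
        pop = [[centre[d] + rng.gauss(0, 1) * spread for d in range(dim)]
               for _ in range(n)]
        pop[0] = centre                 # keep the centre itself in the pool
    return min(pop, key=fitness)

best = bbbc(lambda x: sum(v * v for v in x), dim=2)   # toy sphere function
```

For task allocation, each candidate vector would instead encode a task-to-VM mapping and the fitness would combine execution time and failure probability, per the abstract.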

  • Prolonging the Network life in Wireless Sensors Network - using Refined Region of Interest (RROI)   Order a copy of this article
    by Pritee Parwekar, Sireesha Rodda 
    Abstract: Wireless sensor networks are entering new domains with increasingly novel applications. Resource constraints have been the classic problem associated with these networks, and maximizing the network life without compromising the efficacy of the network is the focus of every research endeavor. Considering the relevance of data, this paper introduces the concept of refining the region of interest and concentrating the network resources in that area to optimize the network life without losing relevant data at adequate resolution. Field trials using a limited number of sensors have been undertaken to validate the idea of a refined region of interest. The concept has helped increase the network life compared to its traditional equivalent.
    Keywords: Wireless Sensor Networks; internet of things; region of interest; energy conservation; optimum protocol.

  • Adaptive Type-2 Fuzzy Controller for a Nonlinear Delay Dominant MIMO Systems: An Experimental Paradigm in LabVIEW   Order a copy of this article
    by M. Kalyan Chakravarthi, Nithya Venkatesan 
    Abstract: Higher order nonlinear systems with prevailing delay have been very challenging with regard to stability and process performance. This paper investigates the performance of a Type-2 Mamdani intelligent controller implemented on a delay dominant system, modelled using a black box approach and identified as a second order nonlinear model of a Dual Spherical Tank Liquid Level System (DSTLLS), under the LabVIEW environment. The adaptive approach of the intelligent Mamdani based fuzzy controller proves very competent compared with previously experimented methods. The performance indices Integrated Absolute Error (IAE) and Integrated Squared Error (ISE) are also calculated for different set point changes of the DSTLLS. The response and error reduction efficiency of this Adaptive Type-2 Intelligent Fuzzy (ATIF) controller has been tested for different flow configurations of the DSTLLS: Multiple Input Single Output (MISO), Multiple Input Multiple Output (MIMO) and Single Input Single Output (SISO).
    Keywords: Non linearity; Mathematical Modelling; MIMO systems; Fuzzy Controllers; Mamdani.
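The two performance indices quoted in the abstract, IAE and ISE, are standard definitions (IAE = ∫|e(t)|dt, ISE = ∫e(t)²dt). A minimal sketch approximating both from a sampled error signal by rectangular integration; the error samples below are invented, not taken from the paper's experiments:

```python
def iae(errors, dt):
    """Integrated Absolute Error: IAE ≈ Σ |e(k)| · dt."""
    return sum(abs(e) for e in errors) * dt

def ise(errors, dt):
    """Integrated Squared Error: ISE ≈ Σ e(k)² · dt."""
    return sum(e * e for e in errors) * dt

# Toy error samples after a set-point step, sampled every 0.1 s.
e = [1.0, 0.5, 0.25, 0.0, -0.1]
iae_val = iae(e, dt=0.1)
ise_val = ise(e, dt=0.1)
```

ISE penalises large transient errors more heavily than IAE, which is why controllers are often compared on both, as done in the paper.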

  • Round Estimation Period for cluster based routing in Mobile Wireless Sensor Networks   Order a copy of this article
    by Maryam El Azhari 
    Abstract: Recent technological advances in digital electronics and robotics manufacturing have enabled sensors to evolve to a higher level. Sensor nodes can change their locations according to the zone of coverage. The overall set of scattered sensor nodes forms a particular type of wireless communication network, named Mobile Wireless Sensor Networks (MWSNs). This type of wireless network is widely used in many applications, including environmental, healthcare and military applications. The mobility constraint, when added to a communication process, brings a number of challenges, and reliability constitutes a major problem to take into account. In this paper, we propose a new technique to enhance the performance of cluster based routing protocols for MWSNs. A probabilistic approach is used to balance the reconfiguration frequency of cluster forming for data transmission within a mobile environment. The results prove the efficiency of our technique, as it increases the performance of cluster based routing protocols in terms of energy consumption, end to end delay and throughput.
    Keywords: Mobility; Cluster Based Routing; Sensor Networks; Mobile Wireless Sensor Networks; routing protocols; Cluster Head; LEACH; Poisson distribution.
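LEACH, named in the keywords, elects cluster heads with the classic threshold T(n) = p / (1 - p·(r mod 1/p)), where p is the desired cluster-head fraction and r the current round. A minimal sketch of that election rule only; the paper's probabilistic tuning of the reconfiguration frequency is not shown, and all numbers are illustrative:

```python
import random

def leach_threshold(p, r):
    """Election threshold for cluster-head fraction p in round r."""
    period = round(1.0 / p)          # a node may serve once per 1/p rounds
    return p / (1.0 - p * (r % period))

def elect_heads(node_ids, p, r, eligible, rng):
    """Nodes in `eligible` (not heads in the last 1/p rounds) self-elect
    when their random draw falls below the round's threshold."""
    t = leach_threshold(p, r)
    return [n for n in node_ids if n in eligible and rng.random() < t]

rng = random.Random(7)
nodes = list(range(100))
heads = elect_heads(nodes, p=0.05, r=3, eligible=set(nodes), rng=rng)
```

The threshold rises as a round cycle progresses, guaranteeing every eligible node eventually serves as head, which is the rotation that spreads energy drain across the network.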

  • Energy Efficient Virtual Machine Consolidation for Cloud Data Centers Using Analytic Hierarchy Process   Order a copy of this article
    by Oshin Sharma, Hemraj Saini 
    Abstract: The ever increasing demand for cloud computing immensely increases the consumption of energy and power. Data centers consume 1.1% to 1.5% of the overall electricity consumed in the world, a share growing by 12% every year. A data center contains many electric components and therefore needs a huge amount of electricity to power and cool them, which results in high carbon dioxide emissions. Minimizing the energy consumption of data centers is very important for environmental sustainability, and it can be achieved by using fewer cloud resources and improving the utilization of these resources. Dynamic consolidation of VMs (virtual machines) plays an important role and is an effective method for reducing energy consumption. Consolidation of VMs can be done on the basis of CPU utilization, the memory occupied by the VM, and the migration time taken by VMs to move from one host to another. In addition, a correlation policy and switching idle servers to sleep or hibernate mode, or switching them off, can also reduce energy and power consumption. In the current study, a technique based on the Analytic Hierarchy Process for selecting VMs for migration is proposed to minimize the total energy consumption and SLA violations of the cloud environment. Results obtained from the CloudSim toolkit using the PlanetLab data set reveal that the proposed approach performs better on energy saving and QoS metrics of cloud data centers compared to conventional techniques.
    Keywords: VM consolidation; VM Migration; Energy consumption; Analytic Hierarchy Process; Cloud computing; VM selection.
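As a hedged illustration of how an Analytic Hierarchy Process priority vector could rank VMs for migration: the criteria mirror the abstract (CPU utilization, memory, migration time), but the pairwise judgement values and VM figures below are invented, and the normalised-column-average approximation to the principal eigenvector is a common textbook simplification, not necessarily the paper's exact method:

```python
def ahp_priorities(matrix):
    """Approximate the AHP principal eigenvector by normalising each
    column of the pairwise comparison matrix and averaging the rows."""
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    norm = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(norm[i]) / n for i in range(n)]

# Pairwise comparison of three criteria on Saaty's 1-9 scale:
# CPU utilisation vs memory vs migration time (values purely illustrative).
criteria = [[1,     3,     5],
            [1 / 3, 1,     3],
            [1 / 5, 1 / 3, 1]]
w = ahp_priorities(criteria)     # criterion weights, summing to 1

# Score each VM as the weighted sum of its normalised criterion values;
# under these made-up numbers, a higher score marks a better migration pick.
vms = {"vm1": [0.9, 0.4, 0.2], "vm2": [0.5, 0.7, 0.6]}
scores = {v: sum(wi * xi for wi, xi in zip(w, x)) for v, x in vms.items()}
```

A full AHP treatment would also check the consistency ratio of the judgement matrix before trusting the weights; that step is omitted here for brevity.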