Forthcoming articles


International Journal of Computer Aided Engineering and Technology


These articles have been peer-reviewed and accepted for publication in IJCAET, but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.


Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.






International Journal of Computer Aided Engineering and Technology (140 papers in press)


Regular Issues


  • SysML Model-Driven Approach to Verify Blocks Compatibility   Order a copy of this article
    by Hamida Bouaziz, Samir Chouali, Ahmed Hammad, Hassan Mountassir 
    Abstract: In the component paradigm, a system is seen as an assembly of heterogeneous components, and system reliability depends on the compatibility of those components. In our approach, we focus on verifying the compatibility of components modelled with SysML diagrams: we model component interactions with sequence diagrams (SDs) and the components themselves with SysML blocks. The SDs constitute a good starting point for compatibility verification. However, verification cannot be applied directly to SDs because they are expressed in an informal language; the SDs must first be translated into formal models, on which the desired properties can then be verified. In this paper, we propose a high-level model-driven approach consisting of an ATL grammar that automates the transformation of SDs into interface automata. In addition, to ease the use of the Ptolemy tool for verifying properties on these automata, we propose Acceleo templates that generate the Ptolemy entry specification.
    Keywords: model-driven; SysML; sequence diagram; interface automata; ATL; Acceleo.

  • On Simple Adaptive Control of Plants not Satisfying Almost Strict Passivity and Positivity Conditions: An Introduction to Parallel Feedforward Configuration   Order a copy of this article
    by Khalil Mokhtari, Mourad Abdelaziz 
    Abstract: Simple adaptive control systems are known to be robust against a class of disturbances and globally stable if the controlled plant is almost strictly positive real (ASPR), that is, if there exists a positive definite static output feedback (unknown and not needed for implementation) such that the resulting closed-loop transfer function is strictly positive real (SPR). The present paper discusses the simple adaptive control scheme for non-almost strictly positive real plants and gives a brief review of the parallel feedforward configuration, which makes the augmented plant satisfy the almost passivity or positivity conditions based on the stabilisability property of the system. The validity of the simplified adaptive algorithm under the positivity condition is examined through numerical simulation for both single-input single-output (SISO) and multi-input multi-output (MIMO) systems.
    Keywords: Simple Adaptive Control; Almost Strictly Positive Real; Parallel Feedforward Configuration.

  • Optimization of Speed Control for Switched Reluctance Motor using Matrix Converter   Order a copy of this article
    by Sridharan Subbiah, Sudha S 
    Abstract: This paper proposes a new converter-based approach to speed control optimization for the Switched Reluctance Motor. The main objectives of the speed control technique are reduced torque ripple and improved control performance, achieved using a matrix-converter-based PID controller. Speed control performance is optimized using the Particle Swarm Optimization algorithm. The potential benefits of the matrix converter are a flexible current profile and the lowest switching frequency, and hence minimal loss. Simulation and analysis have been carried out using MATLAB/Simulink with a reference speed of 1500 rpm within a time frame of 0.045 seconds.
    Keywords: EMC – Embedded Micro Controller; PID- Proportional Integral and Derivative; SRM- Switched Reluctance Motor; NPSO- New Particle Swarm Optimization.
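As a rough sketch of the kind of optimisation loop the abstract describes, the Python fragment below tunes PID gains with a plain PSO. It is an illustration only, not the authors' implementation: the first-order plant standing in for the SRM drive, the gain bounds and the PSO coefficients are all assumptions.

```python
import random

def plant_ise(kp, ki, kd, n=200, dt=0.01):
    """Integral of squared error for a unit step on a toy first-order
    plant dy/dt = -y + u under discrete PID control (Euler integration)."""
    y = integ = ise = 0.0
    prev_err = 1.0
    for _ in range(n):
        err = 1.0 - y                       # setpoint = 1.0
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        y += dt * (-y + u)                  # Euler step of the plant
        prev_err = err
        ise += err * err * dt
    return ise

def pso_tune_pid(n_particles=20, iters=60, seed=1):
    """Plain global-best PSO over (kp, ki, kd); bounds are assumptions."""
    rng = random.Random(seed)
    lo, hi = [0.0, 0.0, 0.0], [10.0, 10.0, 0.5]
    pos = [[rng.uniform(lo[d], hi[d]) for d in range(3)]
           for _ in range(n_particles)]
    vel = [[0.0] * 3 for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [plant_ise(*p) for p in pos]
    g = min(range(n_particles), key=pbest_f.__getitem__)
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    w, c1, c2 = 0.7, 1.5, 1.5               # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(3):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi[d], max(lo[d], pos[i][d] + vel[i][d]))
            f = plant_ise(*pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f
```

The tuned gains should yield a much lower integral squared error than a proportional-only baseline such as `plant_ise(1.0, 0.0, 0.0)`.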

  • A New Robust Fuzzy-PID Controller Design Using Gravitational Search Algorithm   Order a copy of this article
    by Nour E.L. Yakine Kouba, Mohamed Menaa, Mourad Hasni, Mohamed Boudour 
    Abstract: This paper proposes the design of a novel robust Load Frequency Control (LFC) strategy based on an optimised fuzzy-PID controller employing the Gravitational Search Algorithm (GSA). The GSA is applied to optimise the input scaling factors of the fuzzy logic and the PID controller gains. To show the potential of the proposed control methodology, a multi-source two-area interconnected power system was investigated in simulation. The considered test system comprises various power generating units from hydro, thermal and nuclear sources in Area-1, and from hydro, nuclear and diesel sources in Area-2. Initially, the simulation was carried out considering a centralised controller for both areas to cope with load changes, and was then extended to a decentralised controller. Further, a sensitivity analysis was performed to demonstrate the ability of the proposed approach in the face of wide changes in system parameters and in the position of load changes. The frequency deviations and the tie-line power flow change are presented, and the superiority of the proposed control strategy is demonstrated by comparing the results with an individual Gravitational Search Algorithm (GSA), Fuzzy Logic Control (FLC) and some techniques reported in the literature, such as Ziegler-Nichols, the Genetic Algorithm (GA), the Bacterial Foraging Optimisation Algorithm (BFOA) and Particle Swarm Optimisation (PSO).
    Keywords: Ancillary Frequency Control; Load Frequency Control (LFC); PID Controller; Fuzzy Logic Controller (FLC); Optimal Control; Gravitational Search Algorithm (GSA).
    DOI: 10.1504/IJCAET.2019.10017529
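The Gravitational Search Algorithm used above to tune the scaling factors and gains can be sketched generically: agents are "masses" whose fitness sets their gravitational mass, and better agents pull the others. The sketch below is a minimal GSA for an arbitrary objective; the agent count, gravity schedule and bounds are illustrative assumptions, not values from the paper.

```python
import random, math

def gsa_minimize(f, dim=3, agents=15, iters=80, lo=-5.0, hi=5.0, seed=3):
    """Minimal Gravitational Search Algorithm minimising f over [lo, hi]^dim."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(agents)]
    V = [[0.0] * dim for _ in range(agents)]
    best_x, best_f = None, float("inf")
    for t in range(iters):
        fit = [f(x) for x in X]
        b, w = min(fit), max(fit)
        if b < best_f:
            best_f, best_x = b, X[fit.index(b)][:]
        # gravitational masses: best agent -> 1, worst -> 0, then normalise
        m = [(w - fi) / (w - b) if w > b else 1.0 for fi in fit]
        s = sum(m)
        M = [mi / s for mi in m]
        G = 100.0 * math.exp(-20.0 * t / iters)   # decaying gravity constant
        for i in range(agents):
            acc = [0.0] * dim
            for j in range(agents):
                if i == j:
                    continue
                R = math.dist(X[i], X[j]) + 1e-9
                for d in range(dim):
                    acc[d] += rng.random() * G * M[j] * (X[j][d] - X[i][d]) / R
            for d in range(dim):
                V[i][d] = rng.random() * V[i][d] + acc[d]
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
    return best_x, best_f
```

For controller tuning, `f` would be a simulated performance index (e.g. integral of absolute frequency deviation) evaluated at a candidate set of gains.
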
  • A Formal Framework based K-Maude for Modeling Scalable Software Architectures   Order a copy of this article
    by Sahar Smaali, Aïcha Choutri, Faïza Belala 
    Abstract: Dynamic software architecture (DySA) addresses one of the most important and crucial problems in managing dynamically scalable software evolution. In this paper, we propose DySAL, a DySA-specific modeling language, primarily based on interfaces, dealing with both evolution types: spatial reconfiguration at the system level and changes to architectural elements (types and behaviour) at the architecture level. DySAL combines MDE and K-Maude techniques for modeling different DySA aspects (topology, behaviour, interactions and dynamic evolution). It is defined according to an incremental, multi-level approach and benefits from the strengths of the underlying approaches: meta-modeling (MDE) and formal semantics (Maude). This has the advantage of managing all types of dynamic evolution and their possible side effects. Moreover, DySAL facilitates DySA definition with multiple views while covering all phases of the dynamic application development life cycle, including validation and verification. Our approach is illustrated and evaluated through a realistic example of a ubiquitous system.
    Keywords: Scalable software architecture; Dynamic evolution; DySAL; MDE; K-Maude framework; Formal semantics.

  • Design Optimization of Cutting Parameters for a Class of Radially-Compliant Spindles via Virtual Prototyping Tools   Order a copy of this article
    by Giovanni Berselli, Marcello Pellicciari, Gabriele Bigi, Roberto Razzoli 
    Abstract: The widespread adoption of Robotic Deburring (RD) can be effectively enhanced by the availability of methods and integrated tools capable of quickly analyzing the overall process performance in a virtual environment. On the other hand, despite the current availability of several CAM tools, the tuning of RD process parameters is still mainly based on numerous physical tests, which drastically reduce robotic cell productivity. The reason is that several potential sources of error, which are unavoidable in the physical cell, are simply neglected in state-of-the-art CAM software. For instance, the effectiveness of an RD process is highly influenced by the limited accuracy of the robot motions and by the unpredictable variety of burr sizes and shapes. In most cases, it is strictly necessary to maintain a uniform contact pressure between the tool and the workpiece at all times, regardless of the burr thickness, so that either active force feedback or a passive compliant spindle must be employed. Focusing on the latter solution, the present paper proposes a Virtual Prototype (VP) of a radially-compliant spindle, suitable for quickly assessing deburring efficiency in different scenarios. The proposed VP is created by integrating a 3D multi-body model of the spindle's mechanical structure with a behavioural model of the process forces. Differently from previous literature and from state-of-the-art CAM packages, the proposed VP allows quick estimation of the process forces (accounting for the presence of workpiece burr and tool compliance) and of the optimal deburring parameters, which are readily provided as contour maps of the envisaged deburring error as a function of the cutting parameters. As an industrial case study, a commercial compliant spindle is considered and numerical simulations are provided, concerning the prediction of the surface finishing accuracy for either optimal or sub-optimal parameter tuning.
    Keywords: Virtual Prototyping; Parameter Design; Robotic Deburring; Passively Compliant Spindle.

    by Koussaila Iffouzar, Mohamed Fouad BENKHORIS, Haroune AOUZELLAG, Kaci GHEDAMSI, Djamal AOUZELLAG 
    Abstract: A behavioural analysis of the dual star induction machine (DSIM) fed by voltage inverters is presented in this article. One of the drawbacks of supplying the DSIM via PWM voltage inverters is the occurrence of large-amplitude circulating harmonic currents; these currents induce losses in the machine due to the chopping frequency of the inverter. Based on the advanced dynamical model and equivalent circuit of the DSIM developed in this paper, the DSIM fed by a PWM inverter is studied and analysed. The impact of increasing the number of levels of the voltage inverter controlled with natural PWM is studied. Minimisation of the currents circulating between the two stars using the space vector PWM technique is analysed; a simplified SVPWM of the three-level NPC inverter is used to ease the simulation and thereby assess the effect of this technique on torque ripples and on the quality of energy in the machine.
    Keywords: Dual stars induction machine; multilevel voltage source inverter; vector control; dynamical modelling; behavior analysis; current quality; torque ripples.
    DOI: 10.1504/IJCAET.2019.10012312
  • Investigating The Applicability of Generating Test Cases for Web Applications Based on Traditional Graph Coverage   Order a copy of this article
    by Ahmad A. Saifan, Mahmoud Bani Ata, Bilal Abul-Huda 
    Abstract: Web applications provide services to hundreds of millions of people throughout the world. However, developers face a range of problems and challenges in testing them, including the fact that web applications run on diverse and heterogeneous platforms and are written in diverse programming languages. Moreover, they can be dynamic, with their contents and structures determined by user inputs, so they need to be tested to ensure their validity. In this paper we investigate the ability to generate a set of test cases for web applications based on traditional graph coverage criteria. First, we extracted the in-links and out-links from given web applications in order to draw a web graph, before extracting the prime paths from the graph. After that, the invalid transitions were built from the prime paths. Finally, all the invalid transitions were extended with valid transitions. We evaluated our investigation process using web applications of different sizes: two case studies were used in this paper, the first a small application and the second a medium-sized one. The results show how difficult it is to run the huge number of test cases generated manually using graph coverage criteria, even for a small web application.
    Keywords: Web testing; graph coverage criteria; prime paths; invalid transitions; invalid paths; test case generation;.
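The prime-path extraction step the abstract mentions can be reproduced with a small worklist algorithm: grow simple paths (allowing a repeated node only when it closes a cycle) and keep those that are not sub-paths of any other. This is a generic sketch of the standard prime-path definition, not the authors' tool.

```python
def prime_paths(graph):
    """Enumerate prime paths of a digraph given as {node: [successors]}.
    A prime path is a maximal simple path; a simple path may repeat a
    node only if it is both the first and last node (a simple cycle)."""
    simple = []                       # all complete simple paths found
    frontier = [(n,) for n in graph]
    while frontier:
        nxt = []
        for path in frontier:
            if len(path) > 1 and path[-1] == path[0]:
                simple.append(path)   # simple cycle: cannot extend further
                continue
            extended = False
            for s in graph.get(path[-1], []):
                if s not in path or s == path[0]:
                    nxt.append(path + (s,))
                    extended = True
            if not extended:
                simple.append(path)   # dead end: path is complete
        frontier = nxt

    def subpath(p, q):                # is p a proper sub-path of q?
        return p != q and any(q[i:i + len(p)] == p
                              for i in range(len(q) - len(p) + 1))

    return sorted(p for p in simple
                  if not any(subpath(p, q) for q in simple))
```

For a diamond-shaped web graph `{1: [2, 3], 2: [4], 3: [4], 4: []}` this yields the two prime paths `(1, 2, 4)` and `(1, 3, 4)`.
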

  • A Fuzzy Rule Based Approach for Test Case Selection Probability Estimation in Regression Testing   Order a copy of this article
    by LEENA SINGH, Shailendra Narayan Singh, Sudhir Dawra, Renu Tuli 
    Abstract: Regression testing is an essential activity during software maintenance. Due to constraints of time and cost, it is not possible to re-execute every test case for every change that occurs. Thus, a technique is required that selects and prioritizes test cases efficiently. This paper proposes a novel fuzzy rule based approach for selecting and ordering test cases from an existing test suite, predicting the selection probability of each test case using multiple factors. Test cases with a high fault detection rate, maximum coverage and minimum execution time are selected. The results demonstrate the effectiveness of the proposed model in predicting the selection probability of individual test cases.
    Keywords: Prioritization; regression testing; selection probability; fuzzy rule.
    DOI: 10.1504/IJCAET.2020.10016172
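A minimal, hypothetical version of such a fuzzy rule based selection-probability estimator can look like the sketch below: triangular memberships over three normalised factors, a handful of hand-written rules, and a weighted-average (Sugeno-style) defuzzification. The factor names, rule base and consequents are invented for illustration and are not the paper's.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(x):
    """Degrees of membership in {low, medium, high} over [0, 1]."""
    return {"low": tri(x, -0.5, 0.0, 0.5),
            "medium": tri(x, 0.0, 0.5, 1.0),
            "high": tri(x, 0.5, 1.0, 1.5)}

def selection_probability(fault_rate, coverage, exec_time):
    """Sugeno-style fuzzy rules mapping three normalised factors to a
    selection probability; rule consequents are crisp values in [0, 1]."""
    f = fuzzify(fault_rate)
    c = fuzzify(coverage)
    t = fuzzify(1.0 - exec_time)        # fast test -> high "speed"
    # (fault level, coverage level, speed level) -> consequent
    rules = [
        (("high", "high", "high"), 1.0),
        (("high", "medium", "medium"), 0.8),
        (("medium", "high", "medium"), 0.7),
        (("medium", "medium", "medium"), 0.5),
        (("low", "medium", "high"), 0.3),
        (("low", "low", "low"), 0.0),
    ]
    num = den = 0.0
    for (lf, lc, lt), out in rules:
        w = min(f[lf], c[lc], t[lt])    # rule firing strength (AND = min)
        num += w * out
        den += w
    return num / den if den else 0.5    # neutral if no rule fires
```

Test cases would then be ordered by decreasing `selection_probability`.
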
  • Base station Placement Optimization Using Genetic Algorithms Approach   Order a copy of this article
    by Ouamri Mohamed Amine, Abdelkrim KHireddine 
    Abstract: Base station (BS) placement, or the cell planning problem, involves choosing the position and infrastructure configuration for cellular networks. This problem is formulated as a mathematical optimization problem and is optimized in our study using genetic algorithms. Parameters such as the site coordinates (x, y), transmitting power, height and tilt are taken as design parameters for BS placement. This paper takes signal coverage, interference and cost as objective functions, and handover, traffic demand and overlap as important constraints. The received field strength at all test points is calculated by simulation, and path loss is calculated using the Hata model. Assuming a flat service area, the performance of the proposed algorithm was evaluated, with 97% of the users in the network covered with a good quality signal.
    Keywords: Base Station; Network Planning; Antenna; Propagation model; Genetic Algorithms; WSM (Weighted Sum Method).
    DOI: 10.1504/IJCAET.2020.10006440
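A toy version of the GA formulation above - maximising only the coverage objective, without the interference, cost and handover terms or the Hata path-loss model - can be sketched as follows; the area size, coverage radius and GA settings are assumptions.

```python
import random, math

def ga_place_stations(users, k=3, radius=2.0, pop=30, gens=60, seed=7):
    """Toy GA: a chromosome is k base-station coordinates on a 10x10 area;
    fitness = fraction of users within `radius` of some station."""
    rng = random.Random(seed)

    def fitness(ch):
        covered = sum(1 for u in users
                      if any(math.dist(u, s) <= radius for s in ch))
        return covered / len(users)

    def rand_ch():
        return [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(k)]

    P = [rand_ch() for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fitness, reverse=True)
        elite = P[: pop // 4]                       # elitist selection
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, k) if k > 1 else 0
            child = a[:cut] + b[cut:]               # one-point crossover
            if rng.random() < 0.3:                  # Gaussian mutation
                i = rng.randrange(k)
                x, y = child[i]
                child[i] = (min(10, max(0, x + rng.gauss(0, 1))),
                            min(10, max(0, y + rng.gauss(0, 1))))
            children.append(child)
        P = elite + children
    best = max(P, key=fitness)
    return best, fitness(best)
```

In the paper's setting the chromosome would additionally encode power, height and tilt per site, and the fitness would trade coverage off against interference and cost.
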
  • Enhanced Approach for Test suite Optimization Using Genetic Algorithm   Order a copy of this article
    by Manju Khari 
    Abstract: Software is growing in size and complexity every day, so the research community feels a strong need for techniques that can optimize test cases effectively. Search-based test case optimization has been a key domain of interest for researchers. Test case optimization techniques selectively pick from the pool of all available test data only those test cases that satisfy predefined testing criteria. The current study is inspired by the path-finding behaviour of ants and by genetic evolution for the purpose of finding a good optimal solution. The proposed GACO algorithm combines the Genetic Algorithm (GA) and Ant Colony Optimization (ACO) to find suitable solutions to optimization problems. The performance of the proposed algorithm is verified on the basis of various parameters, namely running time, complexity, efficiency of test cases and branch coverage. The results suggest that the proposed algorithm achieves a significantly better average percentage reduction in the number of test cases than ACO and GA in accomplishing the optimization target. These promising results motivate future work.
    Keywords: Bio Inspired Computation; Genetic; Ant Colony optimization; Fitness function.

  • Cost Minimization Technique in Geo-Distributed Data Centers   Order a copy of this article
    by Ayesheh Ahrari Khalaf 
    Abstract: The significant growth of Big Data creates a great opportunity for data analysis, and data centers are continuously becoming more popular. At the same time, data center costs are increasing as the amount of data grows; as Big Data increases, data centers face new challenges. Hence the idea of the geo-distributed data center was introduced. This project investigates the main challenges that data centers face and presents an enhanced technique for cost optimization in geographically distributed data centers. The parameters involved, such as task assignment, task placement, big data processing and quality of service, are analyzed. Analytical evaluation results show that the proposed joint-parameter technique outperformed separate-parameter techniques, in some cases with as much as a 20 percent improvement. The academic Gurobi solver is used for the evaluation.
    Keywords: Cloud computing; data flow; data placement; geo-distributed data centers; cost minimization; task assignment.

  • Hall effects on MHD flow of a Visco-elastic fluid through a porous medium over an infinite oscillating plate with Heat source and Chemical reaction   Order a copy of this article
    by Mangali Veera Krishna 
    Abstract: In this paper, we consider the unsteady flow of an incompressible visco-elastic liquid of the Walters B model with simultaneous heat and mass transfer near an oscillating porous plate in the slip flow regime, taking the Hall current into account. The governing equations of the flow field are solved by a regular perturbation method for a small elastic parameter. The expressions for the velocity, temperature and concentration have been derived analytically, and their behaviour is discussed computationally with reference to different flow parameters with the help of graphs. The skin friction on the boundary, the heat flux in terms of the Nusselt number, and the rate of mass transfer in terms of the Sherwood number are also obtained and their behaviour discussed.
    Keywords: Heat and mass transfer; Hall effects; MHD flows; porous medium; unsteady flows and visco-elastic fluids.

  • Optimized Adaptive Speech Coder for Software Defined Radio   Order a copy of this article
    by Sheetal Gunjal, Rajeshree Raut 
    Abstract: In this paper, the use of the Discrete Wavelet Transform (DWT) along with the Discrete Cosine Transform (DCT) is proposed to exploit speech coding parameters such as bit rate, compression ratio, delay and quality, so as to fit the proposed coder into the family of existing speech coders. The proposed coding technique is applied to different speech signals with a fixed frame size and desired bit rates. The simulation results show that the proposed coding technique achieves a superior compression ratio with comparable processing delay. The Mean Opinion Score (MOS) assessment shows its effectiveness at different bit rates (13 kbps to 256 kbps). The coder has also been tested successfully on an ARM9-based Software Defined Radio (SDR) platform at different frequency bands with the desired bit rates. Hence, the coder may be considered a "one size fits all" coder for efficient utilization of the available frequency spectrum in mobile communication.
    Keywords: DCT; DWT; Software Defined Radio;.

  • A smooth three-dimensional reconstruction of human head from minimally selected computed tomography slices   Order a copy of this article
    by Haseena Thasneem, Mohamed Sathik, Mehaboobathunnisa R 
    Abstract: Three-dimensional reconstruction has been deeply investigated by researchers all over the world. This is a comprehensive effort to find an effective interpolation technique which can provide an accurate and enhanced three-dimensional reconstruction of human head from a select set of computed tomography slice data. Based on structural similarity measurement, a set of slices is selected and segmented using phase field segmentation. Keeping these segmented slices as base, the intermediate slices are re-created using linear and Modified Curvature Registration based interpolation and the results are compared. To further enhance the result and provide a better reconstruction, we apply a refinement process using modified Cahn-Hilliard equation to the interpolated slices. The results are validated both quantitatively and qualitatively. Results show that Modified Curvature Registration based interpolation with our proposed refinement outperforms linear interpolation with refinement providing a simultaneous improvement in sensitivity (95.95%) and specificity (95.94%) with an accuracy of more than 96% and minimal mean square error.
    Keywords: structural similarity measure; phase field segmentation; curvature registration based interpolation; three dimensional reconstruction; computed tomography head slices.
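Slice interpolation of the kind compared in the paper can be illustrated with a simple shape-based scheme: blend the signed distance maps of two segmented slices and re-threshold at zero. This is a generic baseline sketch, not the paper's modified curvature registration interpolation or its Cahn-Hilliard refinement; the brute-force distance computation is only practical for tiny masks.

```python
import math

def signed_distance(mask):
    """Brute-force signed distance map of a small binary mask:
    negative inside the object, positive outside."""
    h, w = len(mask), len(mask[0])
    inside = [(y, x) for y in range(h) for x in range(w) if mask[y][x]]
    outside = [(y, x) for y in range(h) for x in range(w) if not mask[y][x]]

    def dist(p, pts):
        return min(math.dist(p, q) for q in pts) if pts else 0.0

    return [[-dist((y, x), outside) if mask[y][x] else dist((y, x), inside)
             for x in range(w)] for y in range(h)]

def interpolate_slice(mask_a, mask_b, t=0.5):
    """Shape-based interpolation: blend the signed distance maps of two
    adjacent segmented slices and re-threshold at zero."""
    da, db = signed_distance(mask_a), signed_distance(mask_b)
    return [[1 if (1 - t) * da[y][x] + t * db[y][x] < 0 else 0
             for x in range(len(mask_a[0]))] for y in range(len(mask_a))]
```

For two concentric disks the interpolated slice is a disk of intermediate size, which is the qualitative behaviour an intermediate CT slice should show.
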

  • Intelligent Mobile Robot Navigation Using a Neuro-Fuzzy Approach   Order a copy of this article
    by Somia Brahimi, Ouahiba Azouaoui, Malik Loudini 
    Abstract: This paper introduces an intelligent navigation system allowing a car-like robot to reach its destination autonomously, intelligently and safely. Based on a fuzzy neural network (FNN) approach, the applied technique permits the robot to avoid all encountered obstacles and seek its target's location in a local manner, drawing on the concepts of learning and adaptation. It uses two Fuzzy ARTMAP neural networks, a reinforcement trial-and-error neural network and a Mamdani fuzzy logic controller (FLC). Experimental results in the Generator of Modules (GenoM) robotics architecture, in an unknown environment, show the effectiveness of the FNN for the autonomous mobile robot Robucar.
    Keywords: Mobile robots; autonomous systems; intelligent navigation; fuzzy logic; neural networks; obstacle avoidance; targets seeking; Fuzzy Artmap; Mamdani model.

  • Effect of algorithm parameters in development of spiral tool path for machining of 2.5D star-shaped pockets   Order a copy of this article
    by Divyangkumar Patel, Devdas Lalwani 
    Abstract: 2.5D pocket milling, which is used in the manufacture of many mechanical parts, is one of the main operations compared to other milling operations and is extensively used in the aerospace, shipyard, automobile, die and mold industries. In the machining of 2.5D pockets, directional parallel and contour parallel tool paths are widely used. However, these tool paths significantly limit machining efficiency in terms of machining time, surface finish and tool wear because of repeated alteration of the machining direction, stop-and-go motion, sharp velocity discontinuities, and frequent repositioning, retraction, acceleration and deceleration of the tool. In the present work, to overcome the above-mentioned problems, an attempt has been made to generate a spiral tool path for machining 2.5D star-shaped pockets. The successful generation of a spiral tool path depends on various algorithm parameters, such as mesh size, permissible error and number of degree-steps. The effect of these parameters on spiral tool path generation is discussed and the best values are reported. The spiral tool path is developed using a second-order elliptic partial differential equation (PDE) and is free from sharp corners inside the pocket region. The developed algorithm is formulated and presented in steps using MATLAB.
    Keywords: Pocket Machining; Spiral Tool Path; High Speed Machining (HSM); Partial Differential Equation (PDE); Star-shaped Pocket.
    DOI: 10.1504/IJCAET.2020.10011055
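The elliptic-PDE step can be illustrated by solving Laplace's equation over a pocket mask with Jacobi iteration; the iso-level curves of the resulting potential are what a spiral path generator would link together. The grid, boundary values and sink location below are assumptions, and the contour-linking step is omitted.

```python
def laplace_potential(mask, iters=500):
    """Solve Laplace's equation by Jacobi iteration on a grid: cells
    outside `mask` (the pocket boundary) are held at 1.0 and the pocket
    centre at 0.0, so iso-levels of the result nest from boundary to
    centre and can be linked into a spiral tool path."""
    h, w = len(mask), len(mask[0])
    cy, cx = h // 2, w // 2              # assumed spiral centre
    u = [[1.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                u[y][x] = 0.5            # initial guess inside the pocket
    u[cy][cx] = 0.0                      # fixed sink at the centre
    for _ in range(iters):
        v = [row[:] for row in u]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                if mask[y][x] and (y, x) != (cy, cx):
                    v[y][x] = 0.25 * (u[y - 1][x] + u[y + 1][x] +
                                      u[y][x - 1] + u[y][x + 1])
        u = v
    return u
```

By the maximum principle the potential stays within [0, 1] and decreases smoothly toward the centre, which is why paths extracted from it have no sharp interior corners.
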
  • Automatic Generation of Agent-based Models of Migratory Waterfowl for Epidemiological Analyses   Order a copy of this article
    by Dhananjai Rao, Alexander Chernyakhovsky 
    Abstract: Seasonal migration of waterfowl, in which avian influenza viruses are enzootic, plays a strong role in the ecology of the disease and has been implicated in several zoonotic epidemics and pandemics. Recent investigations have established that with just one mutation, current avian influenza viral strains gain the ability to be readily transmitted between humans. These investigations further motivate the need for detailed analysis, in addition to satellite surveillance, of migratory patterns and their influence on the ecology of the disease, to aid the design and assessment of prophylaxis and containment strategies for emergent epidemics. Accordingly, this paper proposes a novel methodology for generating a global agent-based stochastic epidemiological model involving detailed migratory patterns of waterfowl. The methodology transforms Geographic Information Systems (GIS) data containing the global distribution of various species of waterfowl to generate metapopulations for agents that model collocated flocks of birds. Generic migratory flyways are suitably adapted to model migratory flyways for each waterfowl metapopulation, and the migratory characteristics of the various species are used to determine temporal attributes for the flyways. The resulting data is generated in XML format compatible with our simulation-based epidemiological analysis environment called SEARUMS. Case studies conducted using SEARUMS and the generated models for high-risk waterfowl species indicate good correlation between simulated and observed viral dispersion patterns, demonstrating the effectiveness of the proposed methodology.
    Keywords: Migratory Flyways; Tessellation; Agent-based Modeling; Simulation; Computational Epidemiology; Avian Influenza (H5N1).

  • A New High Performance Empirical Model for software Cost Estimation   Order a copy of this article
    by H. Parthasarath Patra 
    Abstract: A software project is successful when it is delivered on time, within budget, and with the required quality as per the client's requirements. In today's software industry, however, cost estimation is a critical issue, and estimating effort and cost is a significantly difficult and challenging task. Over the last 20 years, more than 30 models have been developed to estimate effort and cost for the benefit of the software industry, but these models cannot satisfy the modern software industry due to the dynamic behaviour of software across all kinds of environments. In this study, an empirical high-performance interpolation model is developed to estimate the effort of software projects. The model is compared with the COCOMO based equations, and its predictions are analysed individually for different cost factors. The equation consists of one independent variable (KLOC) and two constants, a and b, which are chosen empirically from historical data of different NASA projects; the results of this model are compared with the COCOMO model for different scale factor values.
    Keywords: Kilo Lines of code; Software cost estimation; MRE; MMRE; PRED.
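The COCOMO-style effort equation the abstract refers to, E = a * (KLOC)^b, and one standard way of re-deriving a and b empirically from historical project data, can be sketched as follows. The constants a = 2.4 and b = 1.05 are the classic Basic COCOMO organic-mode values, not the paper's fitted ones, and the fitting routine is a generic least-squares sketch rather than the paper's interpolation model.

```python
import math

def basic_cocomo_effort(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort in person-months: E = a * (KLOC)^b.
    Defaults are the classic organic-mode constants; an empirically
    fitted (a, b) pair would replace them."""
    return a * kloc ** b

def fit_power_law(points):
    """Fit E = a * KLOC^b to (kloc, effort) pairs by least squares in
    log space: log E = log a + b * log KLOC is a straight line."""
    xs = [math.log(k) for k, _ in points]
    ys = [math.log(e) for _, e in points]
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b
```

Given historical (KLOC, effort) pairs, `fit_power_law` recovers the constants, and `basic_cocomo_effort` then predicts effort for a new project size.
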

  • Model-Driven Development of Self-Adaptive Multi-Agent Systems with Context-Awareness   Order a copy of this article
    by Farid Feyzi 
    Abstract: In recent years, there has been increasing interest in distributed and complex software systems that are capable of operating in open, dynamic and heterogeneous environments and are required to adapt themselves to cope with environmental or contextual changes. In order to achieve or preserve their specific design objectives, such systems need to operate in an adaptive manner; self-adaptive systems have the capability to dynamically modify their behaviour at run-time in response to different kinds of changes. This paper presents a methodology for developing context-aware self-adaptive software systems that exploits the advantages of model driven architecture (MDA) and agent-oriented technology. The approach aims to combine these two promising research areas in order to overcome the complexity associated with the development of such systems and to improve the quality and efficiency of the development process. The methodology focuses on the key issues in the analysis and design of self-adaptive multi-agent systems. Different abstraction levels based on MDA have been proposed and mappings between the models at these levels provided. These mappings bridge the gap between the high-level computation independent models (CIM) and platform independent models (PIM) and the low-level models based on a specific implementation platform called SADE (Self-adaptation Development Environment). The proposed approach has been evaluated through a case study described in the paper.
    Keywords: Self-Adaptive System; Multi-Agent Systems; Self-* properties; Model-Driven Development.

  • Execution of UML based oil palm fruit harvester algorithm: novel approach   Order a copy of this article
    by Gaurang Patkar 
    Abstract: Farmers in rural India have negligible access to agricultural experts who can analyse crop images and give advice, and delayed expert responses to queries often reach farmers too late. This paper addresses that problem with the goal of developing a new algorithm for grading the Elaeis guineensis species of palm fruit to help farmers and researchers. The proposed framework can overcome the problems of human grading based on two attributes, and can predict the percentage of free fatty acid and the oil content; advice can then be rendered from best practices. After consultation with farmers and initial investigation, it was found that alongside colour, the number of detached fruitlets also plays a significant role in grading, so the newly designed algorithm takes both factors into account for decision making. Since manual grading is prone to error, the quality of oil extracted from the fruit is low; there is therefore a need to design an algorithm that serves as a framework for farmers and researchers. This framework can be used with any colour model in any environmental conditions. The automation of the manual grading procedure is carried out with the proposed palm fruit harvester algorithm, specified using Unified Modeling Language (UML) diagrams.
    Keywords: fruitlets; elaeis guineensis; modeling; oil palm fruit; unified modeling language; free fatty acid.

  • A New Intelligent System for Glaucoma Disease Detection   Order a copy of this article
    by Mohamed El Amine Lazouni, Amel Feroui, Saïd MAHMOUDI 
    Abstract: Glaucoma is a redundant disease and a major cause of blindness resulting from damage in the optic nerve. Its major risk factor is increased intraocular pressure. This disease generally spreads very slowly and does not show any symptom at the beginning. The research presented in this paper is both a clinical and a technological aid for diagnosis of early glaucoma based on four different artificial intelligence classification techniques, which are: multi-layer perceptron, support vector machine, K-nearest neighbour and decision tree. A majority vote system was applied to these four artificial intelligence classification techniques in order to optimize the performances of th e proposed system. As far as the ratio cup to disc, which is one of the descriptors of the collected database, is concerned, we developed a non-supervised classification technique, which is the K-means algorithm for the detection of the cup, and another technique that is the Drainage divide algorithm (mathematical morphology method) for the detection of the disc. Moreover, we proposed a contour adjustment technique, which is the Ellipse Fitting method. We also applied a feature selection method (ReliefF) on our database in order to detect the pertinent descriptors ie those responsible for early glaucoma disease. The obtained results are satisfying, promising, and prove the efficiency and the coherence of our new database. They also were confirmed and validated by different doctors in ophthalmology.
    Keywords: Glaucoma; Classification; SVM; MLP; RBF; K-NN; Majority voting; ReliefF; Segmentation; LPE; K-Means; Ellipse Fitting.
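The majority vote over the four classifiers described in the abstract can be sketched in a few lines; the label values and classifier outputs below are hypothetical illustrations, not taken from the paper's database.

```python
from collections import Counter

def majority_vote(predictions):
    """Return the label predicted by most classifiers (ties broken by first counted)."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical outputs of the four classifiers (MLP, SVM, K-NN, decision tree)
# for one fundus image: 1 = glaucoma suspect, 0 = healthy.
labels = [1, 1, 0, 1]
print(majority_vote(labels))  # -> 1
```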

  • A BIM-based framework for construction project scheduling risk management   Order a copy of this article
    by F.H. Abanda 
    Abstract: The management of risks has been at the heart of most construction projects. Building Information Modelling (BIM) provides opportunities to manage risks in construction projects. However, studies on the use of BIM in risk management are sketchy and lack a systematic approach to using BIM for managing risk in construction projects. Based on existing risk models, this study investigated and developed a BIM-based framework for the management of construction project scheduling risk. Although the frameworks were developed by mining risk management processes from Synchro and Vico, both amongst the leading 4D/5D BIM software systems, they can inform risk management in BIM projects supported by any 4D/5D BIM software system that contains a risk management module. The frameworks were validated for their syntactic and semantic correctness.
    Keywords: BIM; construction projects; risk; Synchro; Vico; 4D/5D BIM.

  • On The Order Reduction of MIMO Large Scale Systems Using Block-Roots of Matrix Polynomials   Order a copy of this article
    by Belkacem Bekhiti, Abdelhakim Dahimene, Bachir Nail, Kamel Hariche 
    Abstract: The present paper deals with the problem of approximating linear time-invariant MIMO large-scale systems by reduced-order systems using the so-called block-moment matching method, based on the dominance relation existing between solvents of the system's characteristic matrix polynomial, where the block-roots are reconstructed using a newly proposed procedure. The validation and study of the approximation accuracy is carried out with a specified performance index called the pulse energy criterion. The necessary condition for the correctness and applicability of the proposed method is block-controllability or block-observability. Finally, a numerical example is given to demonstrate the efficiency of the proposed method.
    Keywords: Solvents; Block-roots; Matrix polynomial; Moment matching; MIMO systems.

  • A combining technique based on channel shortening equalization for ultra wideband cooperative Systems   Order a copy of this article
    by Asma Ouardas, Sidahmed Elahmar 
    Abstract: This paper presents a novel combining technique based on the channel shortening approach for cooperative diversity in the context of time hopping ultra wideband (TH-UWB) systems. Since the UWB channel has a very long impulse response compared to the very narrow pulse used, TH-UWB performance is affected by inter-symbol interference (ISI). Rake diversity combining is therefore very effective, but it increases receiver complexity due to its large number of correlators. The idea is to introduce a channel shortening equalizer (CSE), also called a time domain equalizer (TEQ), before Rake reception in the first and second time slots, at the relay and destination respectively. The proposed combination structure gives strong results both in decreasing the complexity of the receiver architecture, by significantly reducing the number of effective channel taps, and in mitigating ISI. Decode and Forward (DF) is used as the relay protocol to retransmit signals from the source to the destination; the relay is assumed to be equipped with multiple antennas, and an antenna selection criterion is used to exploit the diversity with reduced complexity. In the considered relay network, the UWB links between the nodes are modelled according to the IEEE 802.15.4a standard. The performance of the proposed structure is compared to cases where the relay is equipped with a single antenna and with multiple antennas (full diversity). Numerical results show that a significant improvement in the BER of the UWB system (below 10^-5) is obtained by combining the cooperative diversity and channel shortening techniques, both improving system performance and reducing system complexity through the antenna selection strategy, which achieves the full diversity gain.
    Keywords: Time Hopping Ultra Wideband; TH-UWB; Channel Shortening Equalizer; RAKE receiver; Cooperative diversity; Antenna selection; Decode and Forward.

  • A Rewriting Logic Based Semantics and Analysis of UML Activity Diagrams: A Graph Transformation Approach   Order a copy of this article
    by Elhillali Kerkouche, Khaled Khalfaoui, Allaoua Chaoui 
    Abstract: Activity diagrams are UML behaviour diagrams which describe the global dynamic behaviour of systems in a user-friendly manner. Nevertheless, UML notations lack firm semantics, which makes them unsuitable for formal analysis. Formal methods are suitable techniques for system analysis; rewriting logic and its language Maude provide a powerful formal method with flexible and expressive semantics for the specification and analysis of system behaviour. However, the learning cost of these methods is very high. The aim of this paper is to integrate UML with a formal notation in order to make the UML semantics more precise, allowing rigorous analysis of its models. We propose a graph transformation based approach to automatically generate Maude specifications from UML activity diagrams. The proposed approach is automated using the AToM3 tool and is illustrated through an example.
    Keywords: UML Activity Diagrams; Rewriting Logic; Maude language; Meta-Modelling; Graph Grammars; Graph Transformation; AToM3.

  • Shape definition and parameters validation through sheet metal feature for CNC dental wire bending   Order a copy of this article
    by Rahimah Abdul Hamid, Teruaki Ito 
    Abstract: The present study validates the calculated computer-aided manufacturing (CAM) data, or bending code (B-code), according to the theory of a 3D linear segmentation algorithm. The theory uses the Cartesian coordinates of the segmented 3D lines and produces the desired bending parameters in terms of the feeding length (L), the plane rotation angle (β) and the bend angle (θ). The parameters are intended to control and drive a computer numerical control (CNC) dental wire bending machine. Until recently, wire bending in dentistry has been performed manually in both orthodontic and prosthodontic applications. The study proposes to automate the dental wire bending operation by means of a CNC desktop wire bender. For this reason, a theory of 3D linear segmentation is introduced, and the present work discusses the validation of this approach. This paper aims to give an early theoretical result for the wire bending operation and does not consider material properties in the calculation. A reverse engineering approach is adopted, in which a pre-fabricated dental target shape is physically measured and re-designed. A sheet metal feature is used to virtually simulate the wire bending operation based on the theory and to show that the procedure of translating the design into CAM data works well. As a result, the generated sheet metal bending parameters are analysed and compared with the calculated parameters. To conclude, the B-code for the wire bending mechanism has been validated in the present work.
    Keywords: Concurrent engineering; 3D linear segmentation; parameters validation; dental wire; bending code; reverse engineering; CAD/CAM.
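The 3D linear segmentation idea can be illustrated with a small geometric sketch: given the Cartesian coordinates of the segmented polyline, each feeding length L is a segment length, each bend angle θ is the angle between consecutive segments, and each plane rotation angle β is taken here as the angle between consecutive bending planes (segment-pair normals). This is an illustrative reading of the parameters, not the paper's exact formulation, and like the paper it ignores material properties such as springback.

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def angle(a, b):
    """Angle in degrees between two vectors."""
    c = dot(a, b) / (norm(a) * norm(b))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def bending_parameters(points):
    """Feeding lengths L, bend angles theta (between consecutive segments)
    and plane rotation angles beta (between consecutive bending planes)
    for a 3D polyline approximating the target wire shape."""
    segs = [sub(points[i + 1], points[i]) for i in range(len(points) - 1)]
    L = [norm(s) for s in segs]
    theta = [angle(segs[i], segs[i + 1]) for i in range(len(segs) - 1)]
    normals = [cross(segs[i], segs[i + 1]) for i in range(len(segs) - 1)]
    beta = [angle(normals[i], normals[i + 1]) for i in range(len(normals) - 1)]
    return L, theta, beta

# Right-angle zig-zag in the XY plane, then a rise in Z:
pts = [(0, 0, 0), (10, 0, 0), (10, 10, 0), (10, 10, 10)]
L, theta, beta = bending_parameters(pts)
print(L)      # [10.0, 10.0, 10.0]
print(theta)  # [90.0, 90.0]
print(beta)   # [90.0]
```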

    by T.R. Ganesh Babu, S. Nirmala, K. Vidhya 
    Abstract: The aim is to image trabeculectomy blebs using anterior segment optical coherence tomography (AS-OCT) and to measure bleb morphological features such as height, area and extent. In this paper, fuzzy local information C-means clustering is used to segment the bleb boundary. A batch of 25 AS-OCT images is used to assess the performance of the determined parameters against the clinical parameters, and 91.43% accuracy is obtained. The mean values of bleb height, area and extent are 0.2 mm, 1.618 mm2 and 0.343 mm respectively. The results show the potential applicability of the method for automated and objective mass screening for detection of the bleb boundary.
    Keywords: Blebs; Fuzzy local information C-means clustering; Trabeculectomy; Anterior chamber optical coherence tomography; median filter.

  • A Probabilistic Analysis of Transactions Success Ratio in Real-Time Databases   Order a copy of this article
    by Mourad Kaddes, Majed Abdouli, Laurent Amanton, Alexandre Berred, Bruno Sadeg, Rafik Bouaziz 
    Abstract: Nowadays, due to rapidly changing technologies, applications handling more data and providing real-time services are becoming more frequent. Real-time database systems are the most appropriate systems to manage these applications. In this paper, we study statistically the behaviour of real-time transactions under the Generalized Earliest Deadline First (GEDF) scheduling policy. GEDF is a new scheduling policy in which a priority is assigned to a transaction according to both its deadline and a parameter which expresses the importance of the transaction in the system. We focus our study on the influence of transaction composition: precisely, we study the influence of the transaction distribution on system performance and the approximation of the transaction success ratio behaviour by a probability distribution. To this end, we have developed an RTDBS simulator and conducted intensive Monte Carlo simulations.
    Keywords: Real-time database systems; Transactions; Scheduling; GEDF; Stochastics; Monte Carlo simulation.
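The abstract states only that a GEDF priority combines a transaction's deadline with an importance parameter; a minimal sketch of such a scheduler follows, in which the weighted combination is an illustrative assumption, not the paper's formula.

```python
import heapq

def gedf_priority(deadline, importance, w=0.5):
    """Hypothetical GEDF priority: a weighted mix of deadline (earlier is
    more urgent) and importance (larger is more important). The weighting
    is an assumption for illustration; smaller value = served first."""
    return w * deadline - (1 - w) * importance

# Transactions as (name, deadline, importance); pop order follows the priority.
ready = []
for name, d, imp in [("t1", 10, 1.0), ("t2", 12, 9.0), ("t3", 11, 2.0)]:
    heapq.heappush(ready, (gedf_priority(d, imp), name))

order = [heapq.heappop(ready)[1] for _ in range(len(ready))]
print(order)  # -> ['t2', 't1', 't3']: t2's high importance outranks its later deadline
```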

    by Arun Kumar M, Agilan P, Ramamoorthy S, MaheshKumar N 
    Abstract: In this paper, the authors investigate the general solution and generalized Ulam-Hyers stability of an n-dimensional additive functional equation with n > 2 in Banach spaces, by applying the direct and fixed point methods.
    Keywords: additive functional equation; fixed point; generalized Ulam-Hyers stability.

  • Optimistic and Pessimistic Solutions of the Fuzzy Shortest Path Problem by Physarum Polycephalum approach   Order a copy of this article
    by Renu Tuli, Vini Dadiala 
    Abstract: The remarkable behaviour of Physarum polycephalum has been used to solve the fuzzy shortest path problem. A novel algorithm has been developed for varying degrees of optimism, ranging from purely pessimistic to purely optimistic. Providing the decision maker (DM) with a range of solutions gives him/her more flexibility in choosing the solution according to his/her degree of optimism. The triangular and trapezoidal fuzzy numbers representing the cost or duration of travel are converted to crisp numbers by finding their total integral values, and thereafter optimal solutions for varying degrees of optimism are obtained. The process is illustrated by four numerical examples, including a tourist network problem, and the results obtained are compared with existing work. It has been observed that, in comparison with existing work, this method is not only easier to understand and implement but also gives better non-dominated optimal solutions.
    Keywords: Physarum polycephalum; triangular fuzzy numbers; trapezoidal fuzzy numbers; optimistic and pessimistic approaches.
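The defuzzification step described in the abstract can be sketched directly, assuming the standard Liou-Wang total integral value, where the degree of optimism α interpolates between the left (pessimistic) and right (optimistic) integrals of the membership function.

```python
def total_integral_triangular(a, b, c, alpha):
    """Liou-Wang total integral value of a triangular fuzzy number (a, b, c);
    alpha in [0, 1] is the decision maker's degree of optimism
    (0 = purely pessimistic, 1 = purely optimistic)."""
    return alpha * (b + c) / 2 + (1 - alpha) * (a + b) / 2

def total_integral_trapezoidal(a, b, c, d, alpha):
    """Same idea for a trapezoidal fuzzy number (a, b, c, d)."""
    return alpha * (c + d) / 2 + (1 - alpha) * (a + b) / 2

# Fuzzy travel time (20, 30, 45): crisp weights across the optimism range.
for alpha in (0.0, 0.5, 1.0):
    print(alpha, total_integral_triangular(20, 30, 45, alpha))
    # 0.0 -> 25.0, 0.5 -> 31.25, 1.0 -> 37.5
```

Once every arc weight is crisp, any classical shortest path routine can be run per value of α, giving the DM the range of solutions the abstract describes.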

  • Cryptographic Key Management Scheme for Supporting Multi-User SQL Queries over Encrypted Databases   Order a copy of this article
    Abstract: Database outsourcing is becoming more popular, bringing in a new standard called database-as-a-service, where an organization's database is stored in the cloud. In such a setting, both access control and data confidentiality play an important role, particularly when a data owner wishes to publish his data for external use. Every cloud provider promises the security of its platform, while the implementation of solutions to ensure the confidentiality of data stored in cloud databases is left to the data owner. State-of-the-art solutions address only a few preliminary issues concerning SQL queries on encrypted data. In this paper, we propose a novel cryptographic key management scheme that combines data encryption and key management and supports multi-user SQL queries over encrypted databases. We show how the proposed solution enforces access control and ensures data confidentiality. The experimental results presented in this paper show the performance of the proposed scheme.
    Keywords: data confidentiality; access control; key derivation; encryption; metadata.
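Key derivation of the kind listed in the keywords is commonly built on a keyed hash: each table or user key is derived from a parent key plus a public label, so granting access means distributing a key rather than re-encrypting data. The snippet below is a generic HMAC-based sketch of that idea, not the paper's exact scheme; all labels are hypothetical.

```python
import hmac, hashlib

def derive_key(parent_key: bytes, label: bytes) -> bytes:
    """Derive a child key from a parent key and a public label.
    Deterministic, one-way: the child reveals nothing about the parent."""
    return hmac.new(parent_key, label, hashlib.sha256).digest()

master = b"owner-master-key"                      # held only by the data owner
table_key = derive_key(master, b"table:patients") # per-table encryption key
user_key = derive_key(table_key, b"user:alice")   # per-user key for that table

# Re-deriving with the same labels always yields the same key:
assert user_key == derive_key(derive_key(master, b"table:patients"), b"user:alice")
print(user_key.hex()[:16])
```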

  • Employment Effects and Efficiency of Ports   Order a copy of this article
    by Torsten Marner, Matthias Klumpp 
    Abstract: Expected increasing transport volumes in Germany and Europe, combined with increasing sustainability requirements, lead to a prospective major role for sea and inland ports in future transport systems. But especially for inland ports, these increased expectations more and more lead to conflicts regarding port property denomination, as city development heavily pursues non-transport and non-industry dedications, e.g. with high-scale living quarters, recreation and office space concepts like e.g. in D
    Keywords: Employment effects; inland ports; cost-benefit analysis; bottlenecks; freight transport performance; data envelopment analysis.
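Data envelopment analysis, named in the keywords, rates each port (decision-making unit) by the best output-to-input ratio any non-negative weighting allows, subject to no unit exceeding efficiency 1. A tiny single-input, single-output special case reduces to comparing output/input ratios, which the sketch below illustrates with hypothetical throughput figures; full multi-input DEA requires solving a linear programme per unit.

```python
def dea_single_ratio(inputs, outputs):
    """Efficiency scores for the one-input/one-output DEA special case:
    each unit's ratio divided by the best observed ratio."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical ports: input = staff, output = handled tonnage (kt/year).
staff = [120, 80, 200]
tonnage = [600, 480, 800]
scores = dea_single_ratio(staff, tonnage)
print([round(s, 2) for s in scores])  # -> [0.83, 1.0, 0.67]: port 2 is the frontier
```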

  • Evolutionary Neural Network Classifiers for Software Effort Estimation   Order a copy of this article
    by Noor Alhamad, Fawaz Alzaghoul, Esra Alzaghoul, Mohammed Akour 
    Abstract: The estimation of software development effort has become a crucial activity in software project management. Due to this importance, many researchers have focused their efforts on proposing models that construct the relationship between effort and software size and requirements. However, there are still gaps and problems in the software effort estimation process, due to the lack of data available in the initial stage of the project life cycle. The need for an enhanced and accurate method for software effort estimation is an urgent issue that challenges software project management researchers around the world. This work proposes a model based on an Artificial Neural Network (ANN) and the Dragonfly Algorithm (DA), in order to provide a more accurate model for software effort estimation. The applicability of the model was evaluated using several experiments and the results were in favour of the enhancement, with more accurate effort estimation.
    Keywords: COCOMO 81; Artificial Neural Network; Dragonfly Algorithm; Effort estimation.

  • Fuzzy multi-objective approach based small signal stability analysis and optimal control of a PMSG based wind turbine   Order a copy of this article
    by Shubhranshu Mohan Parida, Pravat Kumar Rout, Sanjeeb Kumar Kar 
    Abstract: The objective of this manuscript is to design a controller to enhance the degree of stability through small signal analysis in case of a grid connected permanent magnet synchronous generator (PMSG) based wind turbine and to ensure an optimal set of control parameters to achieve an enhanced performance. The optimal control parameters are computed by optimizing the placement of system eigenvalues and net errors by formulating a fuzzy based multi-objective approach. The idea behind the formulation of the objective function through a multi-objective approach involves the association of error with relative stability of the system through computation of the real parts of eigenvalues. To find the optimal control gains, a two-fold mutation based differential evolution optimization is used. Results from a Matlab based model are presented for validation of the proposed technique to demonstrate the system stability when subjected to wind speed variation.
    Keywords: permanent magnet synchronous generator; PMSG; small signal stability; SSS; wind energy conversion system; WECS; differential evolution; DE; wind turbine; WT.

    by Srividya Venkataramani, Govindarajan  
    Abstract: Labelling in graph theory is an active area of research due to its wide range of applications. A graph labelling is an assignment of integers to the vertices or edges, or both, subject to certain conditions. This paper deals with one such labelling, called odd harmonious labelling. A graph G = (V,E) with |V(G)| = p and |E(G)| = q is said to be odd harmonious if there exists an injection f: V(G) → {0,1,2,...,2q-1} such that the induced function f*: E(G) → {1,3,5,...,2q-1} defined by f*(uv) = f(u) + f(v) is bijective. In this paper we prove that every even cycle Cn (n ≥ 6) with parallel P3 chords is odd harmonious. We also prove that the disjoint union of two copies of an even cycle Cn (n ≥ 6) with parallel P3 chords, and the joint sum of two copies of an even cycle Cn with parallel P3 chords, are odd harmonious. Moreover, we show that the chain of even cycles Cn (n ≥ 6) with parallel P3 chords, the graph joining two copies of an even cycle Cn with parallel P3 chords by a path Ht of order t, dragons with parallel chords obtained from every odd cycle Cn (n ≥ 7) after removing two edges from the cycle Cn, and dragons with parallel P4 chords obtained from every odd cycle Cn (n ≥ 9) after removing two edges from the cycle Cn, are odd harmonious.
    Keywords: Harmonious labelling; Odd harmonious labelling; Cycles with Parallel P3 chords; Joint Sum and Chain of cycles with parallel P3 chords.
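The definition in the abstract is easy to check mechanically: f must inject the vertices into {0,...,2q-1} and the induced edge sums must hit each odd number 1,3,...,2q-1 exactly once. A small verifier, with a known labelling of the cycle C4 as the example:

```python
def is_odd_harmonious(vertices, edges, f):
    """Check whether f is an odd harmonious labelling of the graph
    (vertices, edges) with q = len(edges)."""
    q = len(edges)
    labels = [f[v] for v in vertices]
    if len(set(labels)) != len(labels):              # injectivity
        return False
    if not all(0 <= x <= 2 * q - 1 for x in labels):  # range {0,...,2q-1}
        return False
    edge_labels = sorted(f[u] + f[v] for u, v in edges)
    return edge_labels == list(range(1, 2 * q, 2))    # bijection onto odds

# The cycle C4 with an odd harmonious labelling (edge sums 1, 3, 7, 5):
ok = is_odd_harmonious("abcd",
                       [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")],
                       {"a": 0, "b": 1, "c": 2, "d": 5})
print(ok)  # -> True
```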

  • Fixed point theorems by altering distance technique in complete fuzzy metric spaces   Order a copy of this article
    by Vishal Gupta, Rajesh Kumar Saini, Manu Verma 
    Abstract: The aim of this paper is to define a generalized altering distance function and to extend the Banach contraction principle to complete fuzzy metric spaces using altering distances. The f-mapping also plays an important role in finding the fixed point. Our result extends the result of Harandi et al. (2013) to fuzzy metric spaces.
    Keywords: Fuzzy Metric Space; Control Function; Altering Distance.

  • Design alternatives of Euclidian greatest common divisor with enhanced architecture   Order a copy of this article
    by Qasem Abu Al-Haija, Mohammad M. Asad, Ibrahim Marouf, Mahmoud A. Smadi 
    Abstract: In this paper, we propose comparable reconfigurable hardware implementations of greatest common divisor (GCD) and least common multiple (LCM) coprocessors using Euclid's method and the plus-minus method with variable datapath sizes. The proposed designs utilize the ALTERA Cyclone IV FPGA family with target chip device EP4CGX-22CF19C7, along with the Quartus II simulation package. The designs were synthesized and benchmarked in terms of maximum operational frequency, total path delay, total design area and total thermal power dissipation. The plus-minus method proved its enhanced performance by achieving around 142 MHz of data processing frequency, twice that of Euclid's method, while reducing the total path delay by almost 50% compared to Euclid's method. However, Euclid's method showed less hardware utilization and power dissipation, with almost 36% and 10% lower values respectively than the plus-minus method. Consequently, the plus-minus method can be efficiently employed to speed up computation for many GCD-based applications, such as embedded system designs for public-key cryptography.
    Keywords: Number theory; Greatest Common Divisor (GCD); Euclidian GCD; Least Common Multiple (LCM); Plus-Minus GCD; Field programmable gate arrays (FPGA); Integrated circuit synthesis.
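The hardware trade-off in the abstract comes from the algorithms themselves: Euclid's method needs a divider (remainder), while the plus-minus method belongs to the shift-based family that uses only add/subtract and shifts. As a software illustration, here is Euclid's method next to the classic binary (Stein) GCD, the division-free family the plus-minus method refines; this is a sketch of the algorithmic contrast, not the paper's hardware design.

```python
def gcd_euclid(a, b):
    """Euclid's method: repeated remainder (needs a divider in hardware)."""
    while b:
        a, b = b, a % b
    return a

def gcd_binary(a, b):
    """Shift-based (Stein) GCD: only compares, subtractions and shifts,
    the same division-free flavour as the plus-minus method."""
    if a == 0: return b
    if b == 0: return a
    shift = 0
    while (a | b) & 1 == 0:           # factor out common powers of two
        a >>= 1; b >>= 1; shift += 1
    while a & 1 == 0:
        a >>= 1
    while b:
        while b & 1 == 0:
            b >>= 1
        if a > b:
            a, b = b, a
        b -= a                         # both odd here, so the difference is even
    return a << shift

def lcm(a, b):
    return a // gcd_euclid(a, b) * b

print(gcd_euclid(1071, 462), gcd_binary(1071, 462), lcm(1071, 462))  # 21 21 23562
```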

  • A cloud broker architecture for cloud service selection based on Multi-criteria Decision Making and Rough Set Theory   Order a copy of this article
    Abstract: Cloud computing is a rising field providing computation resources. It represents a new paradigm of utility computing and an enormously growing phenomenon in the present IT industry. The companies which provide services to customers are called cloud service providers (CSPs). Cloud users (CUs) are increasing in number and require secure, reliable and trustworthy CSPs from the market, so it is a challenge for a new user to choose the best provider. In this paper, we propose a cloud broker architecture to help a new customer find the best CSP. This architecture is based on a combination of rough set theory and multi-criteria analysis of parameters related to the quality of service of the available providers. A mathematical model is used to perform this analysis; it integrates multi-agent systems to provide an intelligent, distributed and collaborative method that assists each actor in the cloud computing environment.
    Keywords: cloud Computing; Cloud Broker Architecture; Rough Set Theory; Multi-Criteria Decision Analysis.
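The rough set side of the architecture rests on lower and upper approximations: providers that certainly belong to a target set (e.g. "trustworthy") given an indiscernibility partition, and those that possibly do. A minimal sketch with hypothetical provider names and partition:

```python
def approximations(partition, target):
    """Rough-set lower/upper approximation of a target set under a
    partition into indiscernibility (equivalence) classes."""
    target = set(target)
    lower, upper = set(), set()
    for block in partition:
        block = set(block)
        if block <= target:      # entirely inside the target: certain members
            lower |= block
        if block & target:       # overlaps the target: possible members
            upper |= block
    return lower, upper

# Providers grouped by identical QoS ratings (hypothetical data);
# target = providers past users rated trustworthy.
partition = [{"p1", "p2"}, {"p3"}, {"p4", "p5"}]
trusted = {"p1", "p2", "p4"}
lower, upper = approximations(partition, trusted)
print(sorted(lower))  # ['p1', 'p2']             certainly trustworthy
print(sorted(upper))  # ['p1', 'p2', 'p4', 'p5'] possibly trustworthy
```

The boundary region, upper minus lower, is where the broker's multi-criteria analysis would be needed to rank candidates.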

  • Two Generalized Fixed Point Theorems in $G$-metric Space Without Iterations   Order a copy of this article
    by Saravanan S, Phaneendra T 
    Abstract: Two generalized fixed point theorems are proved using the well-known infimum property of real numbers without an appeal to the iterative procedure.
    Keywords: The Infimum Property; $G$-Metric Space; $G$-Cauchy Sequence; Fixed Point; $G$-contractive Fixed Point.

  • Inverse kinematic Analysis of 5-Axis Hybrid Parallel Kinematic Machine using CAD and Regression analysis approach   Order a copy of this article
    by SURYAM LV, Balakrishna B 
    Abstract: For three decades, parallel kinematic machines (PKMs) have attracted interest from industry and academia in the machine tool/robot sectors for their potentially desirable fast dynamic performance, rigidity and acceptable accuracy. PKMs are widely used for their higher accuracy, which relies on system stiffness and rigidity. In a PKM, manual inverse kinematic analysis for finding the velocity and acceleration of a limb having more than two degrees of freedom (DOF) is tedious, and generation of the transformation matrix is complex. In the present work, a six-DOF 5-axis hybrid parallel kinematic machine (HPKM) with a hemispherical workspace has been modelled and assembled in CATIA. Secondly, inverse kinematic analysis of the PKM was carried out in the digital mock-up unit (DMU) of CATIA. The velocities and accelerations of all three limbs at three different feed rates, and the variations in joint angles, were found. In addition, regression equations were generated for the velocity and acceleration of the three limbs and the joint angles with respect to position and time, while the tool travels along a semicircular contour trajectory.
    Keywords: 5-Axis HPKM; Inverse Kinematics; DMU; contour trajectory; Regression analysis.

  • Multi-Response Optimization in CNC turning of Al-6082 T6 using Grey Taguchi Method coupled with Principal Component Analysis   Order a copy of this article
    by Suresh Kumar Gudimetta, P. Venkateshwar Reddy, Mohana Krishnudu Doni 
    Abstract: The present work analyses the importance of turning parameters on the responses machining time, surface roughness and material removal rate in CNC turning of the aluminium alloy Al-6082 T6 using a tungsten carbide tool. Cutting speed, feed rate and depth of cut, with three levels each, are considered as the machining parameters. The study uses Taguchi's DOE methodology, the grey relational approach and principal component analysis (PCA) to optimize the response parameters simultaneously. Experiments were conducted as per Taguchi's L9 orthogonal array, and the experimental results were analysed using grey relational analysis (GRA) along with PCA. Speed and feed are observed to be statistically significant for the responses, whereas the depth of cut is insignificant. Optimal levels for the parameters are determined using the grey Taguchi method and PCA, and a confirmation test was carried out to validate the results.
    Keywords: Multi-response optimization; Turning; Machining Time; Surface Roughness; Material Removal Rate; Grey Taguchi.
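The grey relational step collapses several responses into one grade per experimental run: normalise each response to [0, 1] toward its ideal, compute the grey relational coefficient against the all-ideal sequence, then take a weighted mean. In the paper the weights come from PCA; the sketch below uses equal weights as a placeholder, and the run data are hypothetical.

```python
def grey_relational_grades(data, larger_better, zeta=0.5, weights=None):
    """Grey relational grades for experiments (rows) over responses (columns).
    zeta is the distinguishing coefficient; equal weights stand in for the
    PCA-derived weights of the paper."""
    n_resp = len(data[0])
    weights = weights or [1.0 / n_resp] * n_resp
    cols = list(zip(*data))
    grades = []
    for row in data:
        coeffs = []
        for j, v in enumerate(row):
            lo, hi = min(cols[j]), max(cols[j])
            x = (v - lo) / (hi - lo)               # normalise to [0, 1]
            if not larger_better[j]:
                x = 1.0 - x                         # smaller-the-better response
            coeffs.append(zeta / ((1.0 - x) + zeta))  # coefficient vs ideal (x = 1)
        grades.append(sum(w * c for w, c in zip(weights, coeffs)))
    return grades

# Hypothetical L9-style runs: columns = (MRR: larger better, Ra: smaller better)
runs = [(120, 1.8), (150, 1.5), (90, 1.2)]
g = grey_relational_grades(runs, larger_better=[True, False])
best = max(range(len(g)), key=g.__getitem__)
print(best)  # -> 1: the run balancing high MRR with moderate roughness wins
```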

  • A New Bio-inspired Algorithm: Lizard Optimization   Order a copy of this article
    by Dharmpal Singh 
    Abstract: A new bio-inspired Lizard algorithm (LA) is proposed for the optimization of soft computing methods used in data mining. An effort has been made to mimic the behaviour of the anole lizard to optimize problems over a data set. Experiments have been carried out on five benchmark problems with ten benchmark methods, namely statistical, fuzzy, neural network, tabu search, simulated annealing, HS, DEA, PCO, ABC and ACO. Average error and residual error were used to compare the performance of LA with that of the other algorithms, and residual analysis and the chi-square test (χ2) were also performed to assess the soundness of the algorithms. The results show that LA achieves good optimization results in terms of both accuracy and robustness.
    Keywords: Data mining; association rule; fuzzy logic; neural network; particle swarm optimization; artificial bee colony algorithm; ACO; TS; SA; DEA; harmony search algorithm; Lizard algorithm.

  • The Rapid Development of Knowledge Bases Using UML Class Diagrams   Order a copy of this article
    by Aleksandr Yurin, Nikita Dorodnykh, Olga Nikolaychuk 
    Abstract: The design of knowledge bases that employs conceptual models and their transformations is one of the most interesting areas of knowledge engineering, and it remains critical. This paper presents an approach to the rapid development of knowledge bases for rule-based expert systems on the basis of model-based generation of program code. The proposed approach uses conceptual models in the form of UML class diagrams as a source of knowledge; however, its main principles can be used to analyse other models, for instance OWL ontologies or SysML block definition diagrams. In accordance with the proposed technique, diagram concepts and relationships are extracted, mapped to the ontology of a subject domain and represented in the form of fact templates and logical rules. An original notation, the Rule Visual Modelling Language (RVML), is used to visualize and modify the mapped elements. The C Language Integrated Production System (CLIPS) is used as the target knowledge base programming language. The approach does not eliminate errors due to inaccurate or incompletely analysed conceptual models, but it minimizes the programming errors that result from hand coding. The algorithms describing the model transformations are implemented as a software prototype, and a case study demonstrating the applicability of the approach and software was conducted. Moreover, the approach has been used to design the knowledge base of a decision-support system for the industrial safety expertise of petrochemical facilities.
    Keywords: knowledge base; knowledge acquisition; code generation; UML class diagram; CLIPS.
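The class-to-fact-template mapping can be illustrated with a toy generator: a UML class's name and attributes become a CLIPS `deftemplate` with one slot per attribute. This is a minimal stand-in for the paper's transformation pipeline, and the class description below is hypothetical.

```python
def class_to_deftemplate(name, attributes):
    """Emit a CLIPS deftemplate (fact template) for a UML-class-like
    description: one slot per class attribute."""
    slots = "\n".join(f"  (slot {a})" for a in attributes)
    return f"(deftemplate {name}\n{slots})"

print(class_to_deftemplate("pipeline", ["material", "pressure", "wall-thickness"]))
# (deftemplate pipeline
#   (slot material)
#   (slot pressure)
#   (slot wall-thickness))
```

A full transformation would also map associations and RVML rules onto `defrule` constructs, which this sketch omits.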

  • Quantifying wind-driven rain on a heritage facade through computational fluid dynamics.   Order a copy of this article
    by Sat Ghosh, Yash Dugar, Namrata Kakoti, Chirag Shah 
    Abstract: The Chennai Central Station is a heritage structure and a stellar landmark; its Neo-Gothic visage is iconic in the heart of this thriving metropolis. It has withstood the ravages of nature: the battering of gale-force winds during tropical storms, unacceptable levels of particulate pollution, and the onslaught of driving rain. This 142-year-old building can survive much longer, combating further natural and man-made impacts, if the existing fa
    Keywords: Lagrangian Integral Time Scales; Computational Fluid Dynamics; Turbulent Trajectories; Heritage Facades.

  • A hybrid approach of firefly and genetic algorithm for solving optimization problems   Order a copy of this article
    by Fazli Wahid, Rozaida Ghazali 
    Abstract: The firefly algorithm (FA) is a recently developed nature-inspired, meta-heuristic, stochastic algorithm that has seen many applications in solving optimization problems since its introduction just a couple of years ago. FA is a simple, flexible, easily implementable and robust approach inspired by the natural phenomenon of light emission by fireflies, but a major drawback of FA is its random initial solution set generation. This random initial solution set leads to an imbalanced relation between the exploration and exploitation properties, resulting in slow local and global convergence rates that ultimately degrade solution quality. The issue can be resolved by providing an organised initial solution set instead of a randomly generated solution search space, which balances the exploration and exploitation capability of FA during the initial solution set generation stage. In this work, the targeted issue is resolved by introducing genetic algorithm (GA) operators, namely selection, mutation and crossover, during initial solution set generation for the standard FA. The proposed technique has been applied to a few standard benchmark minimization and maximization functions and the results have been compared with the standard FA and GA. A significant improvement in the convergence rate can be observed, resulting in high-quality solutions for optimization problems.
    Keywords: Standard Firefly Algorithm; Genetic algorithm; Random solution generation; Hybrid GA-FA; Faster Convergence.
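The hybrid scheme can be sketched as a pipeline: apply one round of GA operators (selection, crossover, mutation) to organise the random initial population, then run the standard FA attraction move. The parameter values, benchmark function and population size below are illustrative assumptions, not the paper's experimental settings.

```python
import math, random

def sphere(x):
    """Benchmark function to minimise (global minimum 0 at the origin)."""
    return sum(v * v for v in x)

def ga_seed(pop, f, mut=0.1, lo=-5.0, hi=5.0):
    """One round of GA operators (tournament selection, one-point crossover,
    mutation) to organise the initial FA solution set."""
    def tournament():
        a, b = random.sample(pop, 2)
        return a if f(a) < f(b) else b
    out = []
    while len(out) < len(pop):
        p1, p2 = tournament(), tournament()
        cut = random.randrange(1, len(p1))
        child = p1[:cut] + p2[cut:]
        out.append([random.uniform(lo, hi) if random.random() < mut else v
                    for v in child])
    return out

def firefly(pop, f, iters=60, beta0=1.0, gamma=1.0, alpha=0.05):
    """Standard FA move: each firefly is attracted to every brighter one."""
    for _ in range(iters):
        for i in range(len(pop)):
            for j in range(len(pop)):
                if f(pop[j]) < f(pop[i]):   # j is brighter (lower cost)
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pop[i] = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
    return min(pop, key=f)

random.seed(1)
initial = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(15)]
best = firefly(ga_seed(initial, sphere), sphere)
print(round(sphere(best), 4))
```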

  • Multi Objective Optimization of Tube Hydroforming Process on IF Steel using Taguchi based Principal Component Analysis   Order a copy of this article
    by P. Venkateshwar Reddy, B. Veerabhadra Reddy, P. Janaki Ramulu 
    Abstract: Tube hydroforming is one of the unconventional metal forming processes, particularly used in the automotive industry. The process has attracted much attention owing to its diversified applications; the aerospace and automotive industries, in particular, depend heavily upon it. Earlier, most researchers worked on material characterisation and the parametric effects of the tube hydroforming process, but very few have attempted multi-objective optimization of the critical process parameters. In this work, the commercial finite element code Pam-Stamp 2G is used to run simulations based on an L27 orthogonal array. A multi-objective optimization is then carried out that simultaneously maximizes the bulge ratio and minimizes the thinning ratio by Taguchi-based principal component analysis (PCA), which is the novelty of this work. The present work also explores the impact of geometrical, process and material parameters on the aforementioned responses using analysis of variance (ANOVA).
    Keywords: Multi-response optimization; Tube Hydroforming; FEM; IF-Steel; Taguchi Method; PCA.

  • The graph SSG(2) is odd graceful and odd harmonious   Order a copy of this article
    by J. Jeba Jesintha , K. Ezhilarasi Hilda Stanley 
    Abstract: A subdivided shell graph is obtained by subdividing the edges in the path of a shell graph. Let G1, G2, G3, ..., Gn be n subdivided shell graphs of any order. The graph SSG(n) is obtained by adding an edge joining the apexes of Gi and Gi+1, i = 1, 2, ..., (n-1); SSG(n) is called the path union of n subdivided shell graphs of any order. In this paper we prove that the subdivided shell graph is odd harmonious. We also prove that SSG(n) is odd graceful and odd harmonious when n = 2.
    Keywords: Odd graceful labeling; odd harmonious labeling; shell graph; subdivided shell graph; the graph SSG(2).

  • Analysis, Identification and Design of Robust Control Techniques for Ultra-Lift Luo DC-DC Converter Powered by Fuel Cell   Order a copy of this article
    by Rajanand Patnaik Narasipuram 
    Abstract: In recent years the development of alternative energy sources has gained importance because of their high-quality characteristics in power generation at low cost. DC-DC converters are widely used for high-power and low-voltage applications. For such cases, the Ultra-Lift Luo converter is more suitable than the Voltage-Lift (VL) and Super-Lift (SL) Luo converters, which are limited in voltage transfer gain and exhibit more voltage ripple. This paper proposes Proportional Integral (PI) and Fuzzy Logic (FL) controllers for the fast and efficient operation of the Ultra-Lift (UL) Luo converter. An Air Breathing Fuel Cell (ABFC) stack is developed and taken as the input power source to the Ultra-Lift Luo converter. The paper focuses on the design, mathematical analysis and operation of the fuel-cell-powered DC-DC converter incorporating PI and FL controllers, simulated using MATLAB/Simulink software. Furthermore, simulation results of the ABFC are presented to review the effect of temperature from 30°C to 70°C on the fuel cell voltage. The dynamic responses of the PI and FL controllers are examined for line as well as load voltage regulation and the simulation results are presented using MATLAB/Simulink.
    Keywords: Air Breathing Fuel Cell; ABFC; Fuel Cell; Stack; DC-DC converter; Proportional Integral controller; PI; Fuzzy Logic Controller; FL; Voltage Lift; VL; Super-Lift; SL; Ultra-Lift Luo-converter; UL.
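The PI control loop discussed above can be sketched in miniature. The snippet below is an illustrative discrete PI regulator driving a generic first-order plant toward a voltage set point; the gains, time constant and 48 V target are assumed values, not the paper's converter model:

```python
def simulate_pi(setpoint=48.0, kp=0.5, ki=5.0, dt=1e-3, steps=2000):
    """Discrete PI loop regulating a first-order plant (an
    illustrative stand-in for the converter's output-voltage
    dynamics; gains and time constant are assumed, not taken
    from the paper)."""
    v, integral, tau = 0.0, 0.0, 0.05   # plant state, integrator, time constant
    for _ in range(steps):
        error = setpoint - v
        integral += error * dt
        u = kp * error + ki * integral   # PI control law
        v += (u - v) * dt / tau          # first-order plant response (Euler step)
    return v

print(round(simulate_pi(), 1))  # → 48.0 (integral action removes steady-state error)
```

The integral term is what drives the steady-state error to zero; a proportional-only controller would settle below the set point.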

  • Simple and Effective Control and Optimization of a Wind Turbine Based on a DFIG   Order a copy of this article
    by Anis Tarfaya, Djalel Dib, Mehdi Ouada, Sihem Ghoudelbourk 
    Abstract: The main contribution of this paper is the control and optimization of a wind turbine based on a Doubly Fed Induction Generator (DFIG). The system comprises a 1.5 MW wind turbine, a gearbox and a DC/AC inverter; numerical simulation of this model is carried out in Matlab/Simulink using different toolboxes. A direct vector control with field oriented control (FOC) is applied, together with two maximum power point tracking (MPPT) strategies, with and without speed control, to improve system performance under rapidly changing wind speed conditions. The simulation results track changes in the power set point quickly and correctly. The disadvantage of classical FOC based on a PI controller is eliminated by using a sliding mode controller, and the simulation results show the good performance of the method used.

  • Spur Gear Safety Prediction through Analysis of Stress Intensity Factor   Order a copy of this article
    by Haidar AL-Qrimli 
    Abstract: The application of gears in heavy industry is extremely challenging owing to the exposure to high loads and high speeds during transmission. These exposures produce unfavorable conditions in gear operation, as they induce cracks. An induced crack may cause not only machine failure but also catastrophic incidents that cost lives. Provided that the crack does not grow towards the shaft, the gear may still function, although at lower efficiency; otherwise, the gear will tear apart and cause catastrophic damage. Therefore, a study of the crack propagation pathway was conducted by analyzing the crack tip behavior. The crack tip behavior was characterized using the stress intensity factor (SIF) to identify the potential fracture mode of the cracked gear model. The analysis applied the extended finite element method (XFEM) in ABAQUS to avoid the re-meshing required by the conventional finite element method (FEM). The simulation outcomes show that the cracked gear model experiences significantly more compressive in-plane shear than tensile stress, and they allow the crack propagation along the tooth to be observed.
    Keywords: Gear; Crack; Safety; Contact Stress; XFEM.
    DOI: 10.1504/IJCAET.2021.10017169
  • Development of smoke prediction model using numerical simulation   Order a copy of this article
    by Nurud Suhaimi, Abdullah Ibrahim, Saari Mustapha, Nor Mariah Adam, Rafee Baharuddin 
    Abstract: Conducting a full-scale fire test is the best way to acquire information regarding smoke and fire conditions for the purpose of fire safety protection and prevention installation. However, full-scale fire experiments are expensive, time-consuming and unrepeatable owing to the unstable nature of fire. The main goal of this study was to develop a quantitative tool to analyze the smoke temperature in a stairway using the Fire Dynamics Simulator software. The study provides a mathematical correlation for predicting temperatures in an adjacent vertical shaft used as a stairway, which is useful in performance-based design. The smoke temperatures predicted using the developed model, compared with the numerical data, showed errors of less than 30%, which were considered satisfactory, and the equation was considered acceptable for this study.
    Keywords: quantitative tool; smoke temperature; vertical shaft; FDS; mathematical correlation.

  • Optimization of Nozzle Diameter of Nebulizer for Salbutamol   Order a copy of this article
    by Vinoth N, Lokavarapu Bhaskara Rao 
    Abstract: In medicine, a nebulizer is a piece of equipment used to deliver medication in the form of a spray breathed into the lungs. The general principle of nebulizers is to use oxygen, compressed air or ultrasonic power to break up medicinal solutions into a fine mist for direct inhalation from the mouthpiece of the equipment. The most commonly used nebulizer is the jet nebulizer, also known as an "atomizer". Jet nebulizers are connected to a compressor that forces compressed air at high velocity through a liquid medication to turn it into a mist, which is then inhaled by the patient. The objective of this study is to optimize the jet nebulizer so as to deliver the medicine in droplets small enough for a satisfactory proportion of the medicated aerosol to reach the lungs. The study concentrates on design optimization of the nozzle through analysis of the flow parameters, evaluation of the fluid flow path and determination of the nozzle diameter and flow pressure.
    Keywords: Drug delivery; Nebulizer; Atomizer.
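The flow-parameter analysis mentioned above rests on the standard continuity and Bernoulli relations. The sketch below estimates exit velocity and pressure drop for an incompressible jet through a nozzle; the diameters, inlet velocity and air density are illustrative assumptions, not the study's results:

```python
import math

def throat_velocity(d_in_mm, d_out_mm, v_in, rho=1.2):
    """Continuity + Bernoulli estimate for an incompressible jet:
    exit velocity and pressure drop across a nozzle (illustrative
    textbook relations, not the paper's CFD model)."""
    a_in = math.pi * (d_in_mm / 2e3) ** 2      # inlet area, m^2
    a_out = math.pi * (d_out_mm / 2e3) ** 2    # exit area, m^2
    v_out = v_in * a_in / a_out                # continuity: A1 v1 = A2 v2
    dp = 0.5 * rho * (v_out ** 2 - v_in ** 2)  # Bernoulli pressure drop, Pa
    return v_out, dp

# Narrowing a 4 mm inlet to a 0.5 mm exit multiplies velocity by (4/0.5)^2 = 64.
v, dp = throat_velocity(d_in_mm=4.0, d_out_mm=0.5, v_in=1.0)
print(round(v, 1), round(dp, 1))  # → 64.0 2457.0
```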

  • Two-Way Distributed Sequential Pattern Mining using the Fruitfly Algorithm along with the Hadoop and MapReduce Framework   Order a copy of this article
    by Vankudothu Malsoru, Naseer A. R., Narsimha G 
    Abstract: Data mining is an effective tool used to extract information from big data, as it provides several benefits that overcome the restrictions of working with such data. In this paper, we present an innovative procedure developed using the UpDown Directed Acyclic Graph (UDDAG) with the Fruit Fly Optimization algorithm, based on a sequential pattern mining algorithm. In this work, the distributed sequential pattern mining algorithm is used to reduce the scanning time and improve scalability, and the transferred database is employed to optimize the memory storage. The proposed method expands the sequences at both ends (prefixes and suffixes) of an identified pattern, thereby yielding quicker pattern expansion and fewer database projections when compared to conventional methods. Our proposed method is implemented in a Hadoop distributed environment to resolve the scalability issues and is executed on the JAVA platform using big datasets with the Hadoop and MapReduce framework.
    Keywords: Data Mining; Updown Directed Acyclic Graph; Fruit-fly algorithm; Distributed Sequential model mining; Hadoop with Map-Reduce framework.

  • Minimum layout of hypercubes and folded hypercubes into the prism over caterpillars   Order a copy of this article
    by Jessie Abraham, Micheal Arockiaraj 
    Abstract: The problem of embedding an n-node graph G into an n-node graph H is an important aspect in parallel and distributed processing. Graph embedding results have been successfully used to establish equivalence of interconnections in parallel and distributed machines. The binary hypercube is one of the most widely used multiprocessor systems due to its simple, deadlock-free routing and broadcasting properties. The folded hypercube is an important variant of hypercube with the same number of nodes. Trees are the fundamental graph theoretical models in many fields including artificial intelligence and various network designs. In this paper we consider the problem of embedding the hypercube and folded hypercube into the prism over a caterpillar in such a way as to minimize its layout.
    Keywords: embedding; folded hypercube; prism over a graph; layout.

  • Decidability of Compatibility for Data-Centric Web Services   Order a copy of this article
    by Mohamed Said Mehdi Mendjel, Hassina Seridi-Bouchelaghem 
    Abstract: The problem of checking the compatibility of data-centric services is discussed herein. It focuses, more specifically, on the compatibility of data-centric services' protocols, where a service protocol is the description of the service's external behaviour. Our study comprises two parts. The first part consists of checking the compatibility of services' protocols represented by the same database instance with different queries; here, we prove that the verification problem is decidable. The second part studies the same problem but includes a database, and hence faces the problem of data infinity, which remains undecidable despite the use of classic verification algorithms. This study is concluded by the implementation of a verification tool based specifically on guarded services with a finite database.
    Keywords: Artifacts; Business Protocols; Data-Centric Services; Infinity of Data; Compatibility of Web Services.

  • Latency-Optimized 3D Multi-FPGA System with Serial Optical Interface   Order a copy of this article
    by Asmeen Kashif, Mohammad Khalid 
    Abstract: Multi-FPGA Systems (MFSs) are capable of prototyping large SoCs. However, planar 2D MFSs with electrical interconnections have a broader spatial distribution and large off-chip delays. One good solution to this problem is to use a three-dimensional (3D) architecture, in which multiple FPGAs are stacked on top of each other rather than being spread across a 2D plane. This provides lower off-chip latency with a smaller footprint. 3D MFS performance can be further improved by reducing the number of interconnects through serial communication. Nevertheless, electrical interconnects are limited in their performance due to latency, and replacing electrical interconnections with an optical interface further reduces off-chip delays. Additionally, the choice of MFS routing architecture also has a substantial effect on system performance. In this paper, we propose novel 3D MFSs with different routing architectures that employ a serialized optical interface, significantly improving system frequency. An experimental architecture evaluation framework and associated CAD tools were developed. The proposed architectures were experimentally evaluated and provided an average system frequency gain of 37% across six benchmark circuits.
    Keywords: 3D Multi-FPGA; optical interface; multiplexing; routing architectures.

    by Ummadi Janardhan 
    Abstract: The novel Local Energy-based Shape Histogram (LESH) feature extraction strategy was proposed for various cancer predictions. This paper extends the original work by applying the LESH technique to identify lung cancer using machine learning approaches. As traditional neural network systems are complex and time-consuming, machine learning approaches, which automate the process of tumor identification, are considered in this work. Before LESH feature extraction, we enhance the radiograph images using a contrast-limited adaptive histogram equalization approach. Suitable machine learning classifiers, namely the extreme learning machine approach and the Support Vector Machine (SVM), are applied using the LESH features for effective analysis of the correct medical state in the X-ray and MRI images. The framework comprises a feature extraction stage, a feature selection stage and a classification stage. For feature extraction/selection, different wavelet functions have been applied to find the best accuracy. A K-nearest neighbor algorithm has been developed and used for classification. The dataset used in the proposed work has 114 nodule regions and 73 non-nodule regions. Accuracy levels of more than 96% for classification have been achieved, which demonstrates the benefits of the proposed approach.
    Keywords: LESH; Feature Extraction; Cancer Cell; K-Nearest Neighbor; Classification;.

  • Fuzzy predictor for parallel dynamic task allocation in multi-robot systems   Order a copy of this article
    by Teggar Hamza, Senouci Mohamed, Fatima Debbat 
    Abstract: This paper presents a model to decompose complex tasks into elemental tasks executed in parallel by multiple robots. In this model, a criterion of accuracy in the parallel dynamic task allocation process (APDTA) is defined. Based on this APDTA, a predictor based on fuzzy logic, called FP-TE, is developed to evaluate the importance of elemental tasks in the system. The inputs of this predictor are observations acquired from sensor data. The FP-TE output is used to allow each robot to decide individually which task should be executed. Simulation results on the transportation of goods by mobile robots are presented to demonstrate the effectiveness of this fuzzy predictor.
    Keywords: dynamic tasks allocation; multi-robot systems; fuzzy predictor; accuracy in tasks allocation; distributed MRS.

  • Prediction of Wine Quality and Protein Synthesis Using ARSkNN   Order a copy of this article
    by Ashish Kumar, Roheet Bhatnagar, Sumit Srivastava 
    Abstract: The amount of data available and the information flow over the past few decades have grown manifold and will only increase exponentially. The ability to harvest and manipulate information from this data has become a crucial activity for effective and faster development. Multiple algorithms and approaches have been developed to harvest information from this data. These algorithms take different approaches and therefore produce varied outputs in terms of performance and interpretation. Owing to their differing functionality, different algorithms perform differently on different datasets. To compare their effectiveness, the algorithms are run on different datasets under a fixed set of restrictions (e.g. the hardware platform). This paper is an in-depth analysis of different algorithms based on a trivial classifier algorithm, kNN and the newly developed ARSkNN. The algorithms were executed on three different datasets and analyzed by evaluating their performance, taking accuracy percentage and execution time as the performance measures.
    Keywords: Classification; Nearest Neighbors; ARSkNN; Similarity Measure;.
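For readers unfamiliar with the baseline, the classic kNN classifier used in such comparisons can be sketched as follows; ARSkNN itself is reported to replace the distance computation with a similarity measure, which is not reproduced here:

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classic kNN: majority vote among the k nearest training
    points by Euclidean distance. This is the baseline that
    similarity-based variants such as ARSkNN are compared
    against; it is not the ARSkNN algorithm itself."""
    dists = sorted(
        (math.dist(x, query), label) for x, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-class dataset: two clusters in the plane.
train = [((0.0, 0.0), "red"), ((0.1, 0.2), "red"),
         ((1.0, 1.0), "blue"), ((0.9, 1.1), "blue")]
print(knn_predict(train, (0.2, 0.1)))  # → red
```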

Special Issue on: ICMCE-2015 Advances in Applied Mathematics

  • Robust optimal sub-band wavelet cepstral coefficient method for speech recognition   Order a copy of this article
    by John Sahaya Rani Alex, Nithya Venkatesan 
    Abstract: The objective of this paper is to propose a robust feature extraction technique for speech recognition systems that remains effective in adverse environments. The efficacy of a speech recognition system depends on the feature extraction method. This paper proposes auditory-scale-like filter banks using optimal sub-band tree structuring based on the wavelet transform. The optimized wavelet filter banks, together with energy, logarithm, discrete cosine transform and cepstral mean normalisation blocks, form a robust feature extraction method. This method is validated on a Hidden Markov Model (HMM) based single-Gaussian isolated word recognition system for additive white Gaussian, street and airport noises at different noise levels. Compared with Fourier transform based methods such as the Mel-Frequency Cepstral Coefficient (MFCC) and Perceptual Linear Predictive (PLP) methods, the wavelet transform based method yielded significant improvement across all noise levels. Experiments were also performed with higher-dimensional MFCC features including delta and acceleration features (MFCC_D_A). This study shows that the wavelet transform based method gives an increase in recognition accuracy of 13% over MFCC_D_A for non-stationary noises.
    Keywords: Speech recognition; feature extraction; wavelet transform; robust; noisy environments; MFCC; PLP.

  • An efficient Hosoya Index Algorithm and its application   Order a copy of this article
    by Paul Devasahayam Manuel 
    Abstract: The number of matchings in a graph is known as the Hosoya index of the graph. The problem of computing the Hosoya index is #P-complete. We show that if the adjacent edges are sequentially ordered, a polynomial-time algorithm can be designed. The significance of this algorithm is demonstrated by computing the Hosoya index for certain chemical compounds such as pyroxene. The algorithm can be applied to grid-like chemical compounds such as sodium chloride, carbon nanotubes, naphthalenic nanotubes, etc.
    Keywords: Hosoya Index; cheminformatics; carbon nanotubes; #P-complete.
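The quantity being computed can be illustrated directly: the Hosoya index counts all matchings, including the empty one. The brute-force sketch below is exponential, and so only suitable for tiny graphs (unlike the paper's polynomial algorithm); it recovers the known values for a 5-vertex path and a 4-cycle:

```python
from itertools import combinations

def hosoya_index(edges):
    """Count all matchings (independent edge sets), including the
    empty matching -- the Hosoya index. Exponential brute force
    over edge subsets, fine only for small illustrative graphs."""
    count = 0
    for r in range(len(edges) + 1):
        for subset in combinations(edges, r):
            used = [v for e in subset for v in e]
            if len(used) == len(set(used)):   # no shared endpoints
                count += 1
    return count

path5 = [(1, 2), (2, 3), (3, 4), (4, 5)]   # P5: Fibonacci number F(6) = 8
cycle4 = [(1, 2), (2, 3), (3, 4), (4, 1)]  # C4: 1 empty + 4 single + 2 perfect = 7
print(hosoya_index(path5), hosoya_index(cycle4))  # → 8 7
```

The path values follow the Fibonacci recurrence m(Pn) = m(Pn−1) + m(Pn−2), which is the classic connection between the Hosoya index and Fibonacci numbers.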

Special Issue on: Future Directions in Computer-Aided Engineering and Technology

  • Improved Indoor Location Tracking System for Mobile Nodes   Order a copy of this article
    by SUNDAR S, KUMAR R, Harish M.Kittur 
    Abstract: The problem of tracking a wireless node is conventionally approached by (i) proximity detection, (ii) triangulation and (iii) scene analysis methods. Of these, the scene analysis method is simple, accurate and inexpensive. Indoor localization technologies need to address the inaccuracy and inadequacy of Global Positioning System (GPS) based solutions in indoor environments (such as urban canyons and the interiors of large buildings). This paper presents a novel indoor Wi-Fi tracking system with minimal error in the presence of barriers, using a Bayesian inference method. The system integrates an Android app and Python scripts (running on a server) to identify the position of a mobile node within an indoor environment. The received signal strength indicator (RSSI) method is used for tracking. Experimental results are presented to illustrate the performance of the system in comparison with other methods. From the tracked nodes, a theoretical solution is proposed for finding the shortest path using Steiner nodes.
    Keywords: Location Tracking; GPS; MANETs; Mobile nodes; Wi-Fi Access points; WLAN; Bayesian Inference; RSSI; Shortest paths; Steiner nodes.
    DOI: 10.1504/IJCAET.2020.10011974
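RSSI-based ranging of the kind used above typically rests on the log-distance path-loss model. The sketch below inverts that model to estimate distance from a received signal strength; the reference power at 1 m and the path-loss exponent are assumed values that would be fitted per indoor environment, not parameters from the paper:

```python
def rssi_to_distance(rssi, rssi_at_1m=-40.0, path_loss_exp=2.0):
    """Invert the log-distance path-loss model:
        RSSI(d) = RSSI(d0) - 10 * n * log10(d / d0),  d0 = 1 m.
    The reference power and exponent n are illustrative; indoor
    environments usually need n fitted empirically (often 2-4)."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * path_loss_exp))

# A 20 dB drop from the 1 m reference with n = 2 corresponds to 10 m.
print(rssi_to_distance(-60.0))  # → 10.0
```

A Bayesian tracker would combine several such noisy range estimates from different access points into a posterior over position rather than trusting any single inversion.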
  • Bi-level User Authentication for Enriching Legitimates and Eradicating Duplicates (EnEra) in Cloud Infrastructure   Order a copy of this article
    by Thandeeswaran R, Saleem Durai M A 
    Abstract: Ease of use of cloud computing has led to exponential growth in all sectors. Exponential growth always attracts duplicates that consume and deplete resources. The cloud is not exempt from invaders who overwhelm resource utilization, so that availability becomes a threat. Availability issues arise from multiple requests directed at the same victim, i.e. a DDoS attack. Hence, the major concern in the cloud is to correctly identify legitimate users and to provide the required services at all times by avoiding DDoS attacks. Multiple techniques are available to identify and authenticate users. This paper not only authenticates users but also works on eliminating invaders, in two phases. In the first phase, the user ID is scrambled in four different steps. In the second phase, users are authenticated depending on their credits. Based on the traffic flow (in the case of a network-level attack) and on the interval between consecutive service requests (in the case of a service-level attack), users are authenticated, upon which services are provisioned accordingly. The simulation results presented here exhibit the strength of the proposed method in the detection and prevention of DDoS attacks in a cloud computing environment.
    Keywords: DDoS attack; SSID; Authentication; credits; cloud environment; legitimate; attackers.

  • Hybrid Algorithm for Twin Image Removal in Optical Scanning Holography   Order a copy of this article
    by P. Bhuvaneswari Samuel, A. Brintha Therese 
    Abstract: Optical scanning holography is an incoherent optical image processing technique in which the complete information of an object or image is recorded as a hologram and later reconstructed to recover the original image. In the hologram reconstruction process, a virtual image is formed along with the real image and appears as twin image noise. To eliminate such noise, a hybrid algorithm is applied while recording the hologram itself. The hybrid algorithm is derived from the combination of the conventional Optical Transfer Function (OTF) used in the existing method and a proposed OTF obtained by varying the spatial frequency, arriving at an optimal spatial frequency that yields good image quality. Various images are tested with the hybrid algorithm. The Matlab R2012b image processing tool is used for simulation, and the simulated values are tabulated and compared with the existing method in terms of Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE). In reconstruction, the proposed method shows a 26% improvement in the MSE and PSNR values. To improve these values further, a case study using different denoising techniques combined with the proposed hybrid algorithm was carried out, and a considerable improvement of 32% was found. Hence the image quality is increased.
    Keywords: Optical Scanning Holography; Fresnel Zone plate; OTF ; spatial frequency; twin image noise; denoising.
    DOI: 10.1504/IJCAET.2020.10010579
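The MSE and PSNR figures quoted above follow the standard definitions, sketched below for 8-bit images represented as flat pixel lists:

```python
import math

def mse(a, b):
    """Mean squared error between two equal-size images (flat lists)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, max_val=255.0):
    """Peak signal-to-noise ratio in dB: 10*log10(MAX^2 / MSE).
    Identical images give infinite PSNR."""
    e = mse(a, b)
    return float("inf") if e == 0 else 10 * math.log10(max_val ** 2 / e)

ref = [0] * 64
noisy = [10] * 64   # every pixel off by 10 -> MSE = 100
print(mse(ref, noisy), round(psnr(ref, noisy), 2))  # → 100.0 28.13
```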
  • Evaluation of Video Watermarking Algorithms on Mobile Device   Order a copy of this article
    by Venugopala P S 
    Abstract: The advancement of Internet services and the design of image and video capturing devices, along with various storage technologies, have made video piracy an issue. Asserting the originality of digital data and holding copyright over a file is always a challenging task. Digital watermarking is a technique for embedding secret information, known as watermarks, within an image or video file; it can be used for authentication and ownership verification. This paper presents an analysis of the mobile deployment of various video watermarking algorithms. The analysis is carried out using quality parameters such as PSNR, execution time and power consumption. The goal of a video watermarking method implemented on a mobile phone is to enhance security and achieve copyright protection for video files captured with the phone. The video file is processed with three different watermarking methods: DCT, LSB and bitstream. These methods are compared for their performance using PSNR, power consumed and execution time as parameters. It is observed that the proposed bitstream method gives better performance than the other methods on these parameters.
    Keywords: DCT; LSB; Bit stream; Watermarking; Copyright protection.
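Of the three methods compared, LSB is the simplest to illustrate. The sketch below embeds watermark bits in the least significant bit of each pixel and reads them back; the paper's video pipeline adds frame handling and robustness steps not shown here:

```python
def embed_lsb(pixels, bits):
    """Hide watermark bits in the least significant bit of each
    pixel (minimal LSB scheme; each pixel changes by at most 1,
    which is visually imperceptible for 8-bit data)."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]

def extract_lsb(pixels, n):
    """Read the first n embedded bits back out."""
    return [p & 1 for p in pixels[:n]]

pixels = [200, 13, 77, 54, 91, 128]
mark = [1, 0, 1, 1]
stego = embed_lsb(pixels, mark)
print(extract_lsb(stego, 4))  # → [1, 0, 1, 1]
```

The simplicity is also LSB's weakness: re-encoding or filtering the video destroys the plane, which is why transform-domain schemes such as DCT watermarking are compared alongside it.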

  • Automatic Identification of Acute Arthritis from Ayurvedic Wrist Pulses   Order a copy of this article
    by Arunkumar N, Mohamed Shakeel P, Venkatraman V 
    Abstract: Traditional ayurvedic doctors examine the state of the body by analyzing the wrist pulse of the patient. Interestingly, the characteristics of the pulses vary in correspondence with various changes in the body. The three pulses acquired from the wrist are named Vata, Pitta and Kapha. Ayurveda holds that when there is an imbalance among these three doshas, one will have disease, and two different diseases will show different patterns in their pulse characteristics. Thus the wrist pulse signal serves as a tool to analyze the health status of a patient. In earlier work, we standardized the signals for normal persons and then classified diabetic cases using approximate entropy (ApEn) [10], later enhancing the results using sample entropy. In the present work, sample entropy (SampEn) is used to classify acute arthritis cases.
    Keywords: Vata; Pitta; Kapha; Approximate Entropy(ApEn); Sample Entropy (SamPEn).
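The sample entropy statistic used above can be sketched as follows. This is a simplified SampEn(m, r) variant with an absolute tolerance r (in practice r is usually set to about 0.2 times the signal's standard deviation); a regular series scores lower than an irregular one:

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Simplified SampEn(m, r): negative log of the conditional
    probability that template pairs matching for m points
    (Chebyshev distance <= r) also match for m+1 points.
    Self-matches are excluded by counting only pairs i < j."""
    def count(mm):
        tpl = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        return sum(
            max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r
            for i in range(len(tpl)) for j in range(i + 1, len(tpl))
        )
    b, a = count(m), count(m + 1)
    return math.inf if a == 0 or b == 0 else -math.log(a / b)

regular = [0, 1] * 50          # perfectly repeating signal
x, chaotic = 0.3, []
for _ in range(100):
    x = 4 * x * (1 - x)        # logistic map in its chaotic regime
    chaotic.append(x)
print(sample_entropy(regular) < sample_entropy(chaotic))  # → True
```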

  • A Real-Time Auto Calibration Technique for Stereo Camera   Order a copy of this article
    by Hamdi Boukamcha, Fethi Smach, Mohamed Atri 
    Abstract: Calibration of the internal and external parameters of a stereo vision camera is a well-known research problem in computer vision. Usually, to obtain accurate 3D results, the camera must itself be accurately calibrated by hand. This paper proposes a robust approach to stereo camera auto-calibration without user intervention. Among the several proven calibration methods and techniques, in this work we exploit a geometric constraint, namely the epipolar geometry. We specifically use seven techniques for feature extraction (SURF, BRISK, FAST, FREAK, MinEigen, MSERF, SIFT) and establish correspondences between points extracted from the stereo images with various matching techniques (SSD, SAD, Hamming). We then exploit the fundamental matrix to estimate the epipolar line by choosing the best eight-point algorithm (Norm8Point, LMedS, RANSAC, MSAC, LTS). A large number of experiments have been carried out, and very good results have been obtained by comparing and choosing the best technique at every stage.
    Keywords: Auto calibration; Robust matching; Epipolar geometry; Fundamental matrix; Matching Technique.
    DOI: 10.1504/IJCAET.2020.10016061
  • Improved automatic age estimation algorithm using a hybrid feature selection   Order a copy of this article
    by Santhosh Kumar G, Suresh H. N 
    Abstract: Age estimation (AE) is one of the significant biometric behaviors for emphasizing identity authentication. Automatic AE from facial images is an actively researched topic, and an important but challenging study in the field of face recognition. This paper explores several algorithms used to improve AE, together with the associated combinations of features and classifiers. Initially, the facial image databases are trained and then features are extracted employing several algorithms such as the Histogram of Oriented Gradients (HOG), Binary Robust Invariant Scalable Keypoints (BRISK) and the Local Binary Pattern (LBP). Here, AE is carried out over three age groups: 20 to 30, 31 to 50 and above 50. The age groups are classified using a Naïve Bayes Classifier (NBC).
    Keywords: Age estimation; BRISK; HOG; LBP; NBC.

Special Issue on: Computer-Aided Intelligent Systems

  • Optimized RBIDS: Detection and Avoidance of Black Hole Attack through NTN Communication in Mobile Ad-hoc Networks   Order a copy of this article
    by Gayathri VM, Supraja Reddy 
    Abstract: Mobile Ad-hoc Networks (MANETs) are an emerging technology in various fields of computer science and, together with sensor applications, provide great benefit to people via smart innovation. In such a network, each node connects on a requirement basis. Since the network is infrastructure-less, any node can enter the network topology and participate in packet transmission. Every node joins the network topology based on its sequence number, distance and RF-based calculation; any node satisfying these requirements can be involved in transferring packets as a router or intermediate node. This becomes an open door for attackers to enter the network, leaving it more vulnerable. In this paper, we are concerned with the black hole attack, in which packets that should be forwarded to the destination are dropped owing to the false identity of a node. The implementation uses the NS2 simulator with the on-demand protocol AODV. An algorithm called RBIDS is proposed to improve network performance by detecting malicious nodes; it is applied to every individual node over a period of time to calculate its performance based on regression values.
    Keywords: NTN; RBIDS; AODV; regression.

  • A new parallel DSP hardware compatible algorithm for noise reduction and contrast enhancement in video sequence using Zynq-7020   Order a copy of this article
    by MADHURA S, Suresha K 
    Abstract: Various video processing applications, such as liquid crystal display processing, high-quality video photography, terrestrial video recording and medical imaging systems, require a robust noise reduction and contrast enhancement technique that provides visually pleasing results. For real-time implementation, a novel hardware architecture has been designed using a Look-Up-Table (LUT) acceleration approach, which helps achieve high-speed processing. Until now, many researchers have worked on noise removal and contrast enhancement of digital videos, but the developed algorithms work with only some varieties of noise and fail to produce desirable results for various types of distortion, and real-time implementation still remains a challenge. Hence, an appropriate filter needs to be designed that addresses both kinds of error. In this paper, an adaptive trilateral filter has been designed for noise reduction; the results are measured using qualitative and quantitative analysis, which has aided better utilization of the hardware for real-time implementation. Experimental results show that the proposed algorithm provides a frame rate of 40 fps on average at a resolution of 720x576. The proposed algorithm was implemented on the ZedBoard Zynq-7020 development kit by Xilinx.
    Keywords: video enhancement; segmentation; trilateral filtering; real-time implementation; Zynq-7020.

  • HDFS Based Parallel and Scalable Pattern Mining Using Clouds for Incremental Data   Order a copy of this article
    by Sountharrajan S., Suganya E, Aravindhraj N, Rajan C 
    Abstract: Increased usage of the Internet has led to the migration of large amounts of data to the cloud environment, which uses the Hadoop and MapReduce framework for managing various mining applications in a distributed environment. Earlier research in distributed mining comprised solving complex problems using distributed computational techniques and new algorithmic designs. But as the nature of the data and user requirements become more complex and demanding, the existing distributed algorithms fail in multiple respects. In our work, a new distributed frequent pattern algorithm named Hadoop based Parallel Frequent Pattern mining (HPFP) is proposed to utilize clusters optimally and mine repeated patterns from large databases very effectively. The empirical evaluation shows that the HPFP algorithm improves the performance of the mining operation by increasing the level of parallelism and execution efficiency. HPFP achieves complete parallelism and delivers superior performance, making it a more efficient algorithm in HDFS than existing distributed pattern mining algorithms.
    Keywords: Cloud Computing; Hadoop Distributed File System; Map Reduce; Association Rules; Frequent Pattern Growth Algorithm; Distributed Mining; Parallel Pattern Mining.

    by Muthukumaresan Mb, Sakthivel S 
    Abstract: Most military organizations now take the help of robots to carry out many risky jobs that cannot be done by soldiers. The robots used in the military are usually equipped with an integrated system including video screens, sensors, grippers and cameras. Military robots also have different shapes according to the purpose of each robot. Here a new system is proposed that uses a low-power Zigbee wireless sensor network to trace intruders (unknown persons), with the robot taking the necessary action automatically. The proposed system, an Intelligent Unmanned Robot (IUR) using Zigbee, saves human lives and reduces manual error on the defence side. It is a specially designed robotic system to save human life and protect the country from enemies.
    Keywords: Microcontroller; ZIGBEE module; IUR robot.

  • An Efficient Packet Image Transmission based on Texture Content for Border side Security Using Sensor Networks   Order a copy of this article
    by Pitchai Ramu, Reshma Gulsar, Raja Jayamani 
    Abstract: In the field of surveillance, several algorithms have been developed to extract meaningful information from an image captured via a camera. In the presence of an intrusion event, these cameras transmit the captured images to the sink node via other intermediate nodes. Since WSNs operate with limited resources, efficient resource utilization is needed while processing and transporting images, and a node does not require the whole image data. Prioritization is one method of utilizing the available resources: images are prioritized from their macro-blocks dynamically. Here a camera is attached to a sensor node, forming a Wireless Multimedia Sensor Network (WMSN). It employs an encoding scheme at the source node that marks blocks as important or not important based on the information they contain. Image texture features and spectral information are used as priority measures to weight the importance of macro-blocks using their textural GLCM properties. Experimental results disclose that the priority encoding scheme adapts itself to the application's quality requirements while reducing the required bandwidth.
    Keywords: Wireless Sensor Network; Prioritization; Wireless Multimedia Sensor Networks; Texture; GLCM; Macro Block.
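The macro-block prioritisation described above can be sketched roughly as follows. This is a minimal illustration, not the authors' exact scheme: it assumes 8x8 blocks, an 8-level grey quantisation, horizontal neighbours only, and uses a single GLCM property (contrast) as the priority score.

```python
import numpy as np

def glcm(block, levels=8):
    """Grey-level co-occurrence matrix for horizontal neighbour pairs."""
    q = (block.astype(np.float64) * levels / 256).astype(int).clip(0, levels - 1)
    g = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        g[a, b] += 1
    s = g.sum()
    return g / s if s else g

def block_priority(image, block=8):
    """Rank macro-blocks by GLCM contrast: textured blocks first."""
    h, w = image.shape
    scores = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            p = glcm(image[y:y + block, x:x + block])
            i, j = np.indices(p.shape)
            contrast = ((i - j) ** 2 * p).sum()   # GLCM contrast property
            scores.append(((y, x), contrast))
    return sorted(scores, key=lambda s: -s[1])
```

Blocks with high contrast (texture) come first and would be encoded as "important"; flat background blocks fall to the end of the priority list.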

  • Deep Learning based Techniques to Enhance the Precision of Phrase-Based Statistical Machine Translation System for Indian Languages   Order a copy of this article
    by Sanjanasri JP, Anand Kumar M, Soman KP 
    Abstract: The paper focuses on improving the existing Phrase-Based Statistical Machine Translation (PB-SMT) system by integrating deep learning knowledge into it. In this paper, a deep learning based PB-SMT system for Indian languages is developed so as to improve the conditional probability of the phrase table, and the back-off n-gram language model is replaced with a neural probabilistic language model to improve language-model performance. It is shown that the deep feature based PB-SMT is better than the standard PB-SMT system. It is also shown that integrating manually created dictionaries, trained as a separate translation model, can enhance the results of the statistical machine translation system during decoding. For automatic evaluation, it is shown that RIBES is a better evaluation metric for Indian languages than BLEU, the standard one.
    Keywords: Indian Languages; Phrase-based Statistical Machine Translation (PB-SMT); Neural Probabilistic Language Model (NPLM); Deep Belief Network (DBN); Pruning; Minimum Error Rate Training (MERT); Bilingual Evaluation Understudy (BLEU); Rank-based Intuitive Bilingual Evaluation Score (RIBES).

  • Enhancing Performance Of WSN By Utilizing Secure QoS Based Explicit Routing   Order a copy of this article
    by Kantharaju HC 
    Abstract: Wireless Sensor Networks (WSN) are infrastructure-less and self-configured wireless networks that allow monitoring the physical conditions of an environment. Many researchers focus on enhancing the performance of WSNs in order to provide effective delivery of data on the network, but still end up with lower quality of service in terms of data transmission time, energy consumption, delay and routing. We tackle this problem by introducing a new routing algorithm, a QoS based Explicit Routing Algorithm, which helps in transmitting data from source node to destination node in a WSN. We also involve a clustering process in the WSN based on the GA and PSO algorithms (Genetic Algorithm and Particle Swarm Optimization), followed by a cluster head selection process, which is most important in the routing process. Secure communication is the most important need for a WSN; for that we propose IBDS (Identity based Digital Signature) and EIBDS (Enhanced Identity based Digital Signature), which reduce computation overhead and increase resilience of the WSN. We also use AES (Advanced Encryption Standard) to ensure security between nodes and avoid data being hacked by intruders. This process is done on the base station, sensor nodes and cell coordinator nodes. Thus our proposed framework is effective, increasing the lifetime of nodes and improving secure communication between nodes.
    Keywords: Wireless Sensor Network; Cryptography; Digital Signature; Quality of Service.
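As a loose sketch of the PSO side of the clustering step above (the GA, IBDS/EIBDS and AES components are out of scope here), the following positions k cluster heads so that the total squared node-to-head distance is minimised; the swarm parameters (w, c1, c2, particle count) are conventional illustrative values, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(heads, nodes):
    """Total squared distance from each sensor node to its nearest cluster head."""
    d = ((nodes[:, None, :] - heads[None, :, :]) ** 2).sum(-1)
    return d.min(axis=1).sum()

def pso_cluster_heads(nodes, k=2, particles=20, iters=60, w=0.7, c1=1.4, c2=1.4):
    """Classic global-best PSO over candidate head placements."""
    pos = rng.uniform(nodes.min(), nodes.max(), (particles, k, 2))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([fitness(p, nodes) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([fitness(p, nodes) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()
```

In a real deployment the fitness would also fold in residual node energy, which is what makes head selection matter for network lifetime.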

  • Hybrid Data Model Of PACE and Quadruple: An Efficient Data Model for Cloud Computing   Order a copy of this article
    by CLARA KANMANI, Dr Suma V., Guruprasad N
    Abstract: Cloud computing is a promising computing paradigm that involves outsourcing of computing resources, with the capabilities of expendable resource scalability and on-demand provisioning with little or no up-front IT infrastructure investment costs. The semantic web is an extension of the web through standards by the World Wide Web Consortium (W3C). The Resource Description Framework (RDF) is the semantic data model for cloud computing which provides interoperability but is not effective in terms of scalability, formal semantics, query optimization and reification. One of the challenges in cloud computing therefore is to enhance the RDF data model, which is achievable by addressing the current weakness of the RDF reification mechanism. This paper hence puts forth a comprehensive overview of challenges in RDF reification. Further, the paper introduces a data model which uses a hybrid approach of Provenance Aware Context Entity (PACE) and the quadruples method of reification. This hybrid RDF data model is deployed and tested for its performance on the AWS public cloud. Experimental results indicate that the proposed hybrid data model enhances accessibility and maintainability, and also accelerates query execution time.
    Keywords: Cloud computing; Semantic web; Resource description framework; Data model; PACE; Quadruple.

  • A Semi-Automated System for Smart Harvesting Of Tea Leaves   Order a copy of this article
    by Manesh Murthi, Senthil Kumar Thangavel 
    Abstract: Tea leaf cultivation is a major part of livelihood in hill stations like the Nilgiris. The conventional method of tea leaf plucking is done manually with a knife; harvesting machines have also been designed to speed up plucking. Manual plucking gives better results with labourers who have better experience and knowledge of the terrain. The paper proposes a semi-automatic working model with an arm that can move around and pluck the leaves. A complete preprocessing phase using key frame extraction, rice counting, and optical flow with a noise model was done by the authors in an earlier paper. This process is improved by using an active contour with optical flow algorithm that minimises the region on which the tea leaf detection algorithm is applied. The second phase of the paper also suggests how a deep learning approach can be used to improve the performance of the proposed work. The proposed work is novel because it considers motion with keyframe capabilities and the noise model using deep learning. The proposed work has been evaluated with parameters like precision, recall, FAR and FRR to assess the nature of misclassifications.
    Keywords: Video analytics; Noise model; Keyframe; Raspberry Pi; Arduino Due; optical flow; rice counting; segmentation.

    Abstract: The most popular renewable energy technology is the hybrid power system consisting of wind and solar energy sources, because the system is reliable and the sources are complementary in nature. A wind/PV hybrid system is commonly used in Distributed Generation (DG). This paper proposes a new solution for improved voltage stability with quality power output. In this system, the voltage outputs from the wind energy conversion system (WECS) and the photovoltaic panel are given to separate DC-DC converters, independently controlled and connected to a common DC bus, and from there the power is inverted. In the proposed controller, voltage stability is obtained by applying the Honey Bee (HB) optimization algorithm along with a PI controller. The proposed method is implemented in the Simulink platform. The performance of the suggested coordinated control system is analysed by comparing computer simulation results with and without the controllers, and it shows that the proposed system is more efficient.
    Keywords: Hybrid Power System; Distributed Generation (DG); Honey Bee algorithm; PI; Wind and solar energy.

    by Mohan Kumar JK, Abdul Rauf H, Umamaheswari R 
    Abstract: An optimization of the steady-state performance of a wide range of SC converters used in the control of photovoltaic (PV) systems, aiming to enhance the efficiency and output regulation of a Switched Capacitor Direct Current (SCDC) converter, is proposed here. It uses many power converters to effectively transmit power via a large cable with greater Electromagnetic Interference (EMI). Under no-load conditions, the model shows that an ideal DC voltage transformation is achieved by the converter, and the conversion losses are altered because the voltage drop due to the non-zero load current, measured at the load-side impedance, has lesser EMI. The capacitor charging and discharging losses and the conduction loss at the resistance are mainly due to the output impedance. Following this, the DC-DC converter is extended with a smart control technique to track the Maximum Power Point (MPPT) of a PV system under constraints of varying temperature and irradiance, along with ILS (Iterated Local Search).
    Keywords: Direct Current (DC) Converter; Photo-Voltaic (PV) System; Iterated Local Search (ILS) and Maximum Power Point Tracking (MPPT).

  • Artificial Intelligent Technology for safe driver assistance system   Order a copy of this article
    by Al Smadi Takialddin 
    Abstract: This paper mainly studies artificial intelligence technology for safe driver assistance in intelligent vehicles (IV). The capacity of AI to control cars is difficult to overestimate: unmanned vehicles require onboard systems that can handle a huge amount of data from surveillance cameras, navigators, distance sensors, etc., and recognise and analyse thousands of requests. In the proposed system, the speed limit is transmitted from transmitters placed along the road to a receiver mounted inside the car, somewhere on the dashboard, with a digital display indicating the current speed limit and the current car speed. This gives the driver time to attend to other driving demands, supporting safe driving and keeping within the speed limit. As a result of using this device, car accidents caused by exceeding the speed limit, traffic violations caught by radar, and traffic fines can be avoided. The work in this paper could be developed further by adding another transmitter on the car side, to send each fine with its time, date and car ID to a receiver fixed on the roadside; this step could make documenting car violations easier for the traffic department. We did not take this step because the time period available for the project was not sufficient.
    Keywords: Safe Driving; assistance system; intelligent vehicle; road conditions; Traffic.
    DOI: 10.1504/IJCAET.2020.10014759
  • PID Controller tuning using hybrid optimization technique based on Box's evolutionary optimization and teacher-learner-based-optimization   Order a copy of this article
    by Vinay Pratap Singh, Naresh Patnana, Sugandh Pratap Singh 
    Abstract: In this paper, a hybrid optimization technique based on Box's evolutionary optimization and teacher-learner-based-optimization (BEO-TLBO) is proposed for proportional-integral-derivative (PID) controller tuning for level control of a three-tank system. The integral-square-error (ISE) of the unit step response is minimized to obtain optimal controller parameters. The ISE is designed in terms of alpha and beta parameters. In BEO-TLBO, a global search is first carried out over the entire search space to determine the set of desired controller parameters using teacher-learner-based-optimization (TLBO). The search is then refined in the second stage using Box's evolutionary optimization (BEO). The results obtained using BEO-TLBO are compared with other existing techniques. Computer simulations reveal that the hybrid optimization based approach meets the desired specifications with greater accuracy compared to the other existing methods.
    Keywords: Box’s evolutionary optimization (BEO); ISE; Optimization; PID controller; Teacher-learner-based-optimization (TLBO).
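The tuning objective above, minimising the ISE of the unit step response, can be illustrated with a crude sketch: a first-order plant (an assumption for brevity; the paper uses a three-tank level model) simulated by Euler integration under a discrete PID law, with a toy grid search standing in for the BEO-TLBO optimiser.

```python
import numpy as np

def ise_step(kp, ki, kd, T=20.0, dt=0.01):
    """Integral-square-error of the closed-loop unit-step response of a
    first-order plant dy/dt = -y + u under a discrete PID law."""
    y, integ, prev_e, ise = 0.0, 0.0, 1.0, 0.0
    for _ in range(int(T / dt)):
        e = 1.0 - y                          # unit step reference
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt
        prev_e = e
        y += dt * (-y + u)                   # Euler step of the plant
        ise += e * e * dt
    return ise

# crude grid search standing in for the BEO-TLBO optimiser
best = min(((kp, ki) for kp in (1, 5, 10) for ki in (0.5, 1, 2)),
           key=lambda p: ise_step(p[0], p[1], 0.1))
```

With this toy plant the grid picks the largest proportional gain, since a faster error decay directly shrinks the ISE integral; BEO-TLBO replaces the grid with a global TLBO search refined by BEO.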

  • Hidden Object Detection for Classification of Threat   Order a copy of this article
    by Gautam KS, Senthil Kumar Thangavel 
    Abstract: Automated video surveillance has become important due to the focus from government and users on improving the smartness of buildings. A system developed for this can be used in prisons, airports, banks, etc. Though there are solutions for this, they fail in situations of mishaps and with hidden objects that could become a threat to the environment. In this paper a framework has been built using a Modified K-Means Segmentation Algorithm to detect hidden objects. The framework operates in two phases. Phase 1: Modified K-Means Segmentation Algorithm for segmenting the hidden objects. Phase 2: Deep Convolutional Neural Network for classifying the hidden object. The algorithm searches for the approximately optimal value of K and segments the object. The result of the algorithm is given to a Deep Convolutional Neural Network to classify the type of object. The algorithm is tested on a manually built dataset captured with a Fluke TiS40 thermal imager. The experiments were carried out in batches of 50x50 images, and the performance of the approach is presented using Top-1 Accuracy and Mean Average Precision, which are 0.94 and 0.64 respectively. From the experimental analysis, we infer that the proposed algorithm works with precision 0.88 and false discovery rate 0.12.
    Keywords: Video Analytics; Deep Learning; Deep Convolutional Neural Network; Thermal image; K-Means Segmentation.
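Phase 1's idea of searching for an approximately optimal K can be sketched as below. The elbow-style stopping rule, the 1-D intensity clustering and the quantile initialisation are illustrative simplifications, not the authors' modified algorithm.

```python
import numpy as np

def kmeans(x, k, iters=30):
    """Plain 1-D k-means on pixel intensities with quantile-spread initial
    centres; returns centres and inertia (within-cluster squared error)."""
    c = np.quantile(x, np.linspace(0, 1, k))
    for _ in range(iters):
        lab = np.abs(x[:, None] - c[None, :]).argmin(1)
        for j in range(k):
            if (lab == j).any():
                c[j] = x[lab == j].mean()
    return c, ((x - c[lab]) ** 2).sum()

def pick_k(x, kmax=6, tol=0.3):
    """Elbow-style rule: keep adding clusters while each new one still cuts
    inertia to below a fraction tol of the previous value; otherwise stop."""
    prev = kmeans(x, 1)[1]
    for k in range(2, kmax + 1):
        cur = kmeans(x, k)[1]
        if prev == 0 or cur / prev > tol:
            return k - 1
        prev = cur
    return kmax
```

On an image with three well-separated intensity populations this rule settles on K = 3; the segmented regions would then be cropped and passed to the CNN classifier.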

  • Proposed Variants of Charged System Search Algorithm for Location Area Optimization in Mobile Wireless Communication Networks   Order a copy of this article
    by Palaniappan Senthilnathan, Ameer John Sirajudeen, Venkatachalam Ilayaraja, Meenakshisundaram Iyapparaja 
    Abstract: Location area optimisation is used to diminish the location update cost and paging cost in mobile wireless communication networks. Retaining a heuristic optimisation technique helps to diminish the location and paging costs; the problem is combinatorial optimisation in nature. The number of mobile users grows day by day, many users are allocated to various mobile subscribers, and thus forecasting the ideal area is always a big job. The charged system search algorithm (CSSA) is employed to overcome the local and global minima that often occur during the runs. The variants introduced into the CSSA include wavelet models and gravitational search algorithm (GSA) models. The prominent features of both the wavelet model and the GSA model are obtained and shared with the charged system search algorithm to minimise the total cost of location area optimisation in mobile wireless communication networks (MWCN).
    Keywords: heuristic optimisation technique; wavelet models; charged system search algorithm; CSSA; gravitational search algorithm; GSA; mobile wireless communication networks; MWCN.

Special Issue on: Image Processing in Computer Vision - Techniques and Advancements

  • Automated extraction of dominant endmembers from hyperspectral image using SUnSAL and HySime   Order a copy of this article
    by Nareshkumar Patel, Himanshukumar Soni 
    Abstract: Linear Spectral Unmixing (LSU) is a widely used technique in the field of remote sensing (RS) for the accurate estimation of the number of endmembers, their spectral signatures and fractional abundances. Large data size, poor spatial resolution, unavailability of pure endmember signatures in the data set, mixing of materials at various scales and variability in spectral signatures make linear spectral unmixing a challenging and ill-posed inverse task. There are mainly three basic approaches to the linear spectral unmixing problem: geometrical, statistical and sparse regression. The first two approaches are kinds of blind source separation (BSS). The third approach assumes the availability of standard, publicly available spectral libraries, which contain spectral signatures of many materials measured on the earth's surface using an advanced spectroradiometer. The problem of linear spectral unmixing, in a semi-supervised manner, is simplified to finding the optimal subset of spectral signatures from the spectral library known in advance. In this paper, the concept of soft thresholding is incorporated along with sparse regression for automatic extraction of endmember signatures and their fractional abundances. Our simulation results, conducted both for a standard publicly available synthetic fractal data set and a real hyperspectral data set, the cuprite image, show procedural improvement in spectral unmixing.
    Keywords: Spectral Unmixing; Sparse Unmixing; Hyperspectral Unmixing; Alternating Direction Method of Multipliers; ADMM; HySime.
    DOI: 10.1504/IJCAET.2020.10010900
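The soft-thresholding operator mentioned above is the proximal map of the L1 penalty at the core of SUnSAL-style sparse regression; a minimal sketch, with a toy ISTA loop in place of the full ADMM solver, might look like this (library matrix, observation and lambda are placeholders).

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding: the proximal operator of lam * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista(A, y, lam=0.1, iters=500):
    """Iterative shrinkage-thresholding for min ||Ax - y||^2 / 2 + lam * ||x||_1;
    the sparse solution keeps only the library signatures present in the pixel."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - step * A.T @ (A @ x - y), lam * step)
    return x
```

The nonzero entries of the recovered abundance vector identify the dominant endmembers; SUnSAL adds the abundance sum/positivity constraints via ADMM on top of this shrinkage step.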
  • Feature Extraction and Classification of COPD Chest X-ray Images   Order a copy of this article
    by P. Bhuvaneswari Samuel, A. Brintha Therese 
    Abstract: COPD (Chronic Obstructive Pulmonary Disease) is a group of lung diseases including emphysema, chronic bronchitis, asthma and some kinds of bronchiectasis. This group of diseases is expected to be one of the major causes of morbidity and the third leading cause of mortality by 2020. Many people with COPD also develop lung cancer, likely due to a history of smoking cigarettes. India contributes the highest COPD mortality in the world. If the disease is identified at an early stage, the survival rate is increased. In this paper a novel method is proposed to classify the disease COPD in chest x-ray images. Prior to classification, essential features are extracted. In this regard, some structural features, including the number of ribs in the chest x-ray, heart shape, diaphragm shape and distance between ribs of the given x-ray image, are extracted by means of various image processing techniques. Based on these features the input image is classified as normal or COPD with various classifiers including MLC, LDA, Neural Network and Genetic Algorithm. 600 x-ray images (PA view) were tested with the proposed method and classified based on the above features. The maximum classification accuracy achieved is 97.9%. Based on the comparison results of the different classifiers, the Genetic Algorithm based classification method proved to have the highest accuracy. This work not only ends with the classification of COPD images, it also enables medics to identify the heart disease cardiomegaly.
    Keywords: COPD; Adaptive histogram equalization; Hough transform; Zernike moments; classification; MLC; LDA; Neural Network; Genetic Algorithm.
    DOI: 10.1504/IJCAET.2020.10010445
    by Koppola Mohan 
    Abstract: Object face liveness detection for genuine face recognition and user authentication is a difficult task, and day by day it becomes an interesting challenge in real-time vision and security applications. For decades, various authors have proposed and developed new techniques and methods, but systems still need improvement to recognise genuine object faces from spoofing objects with increasing accuracy. Many existing methods and techniques fail in distinguishing genuine objects across various object shapes and individual differences between objects. An ordinary classifier cannot generalise well to various kinds of objects in different orientations, especially in the case of blurred images. To overcome this problem, we propose an object-specific face authentication system for liveness detection using combined feature descriptors with a fuzzy based SVM classifier. It allows selecting a specific area from the whole object and extracting features from that specific area, leading to a reduction in processing time and in the complexity of feature extraction. The system then recognises the respective faces, and finally it checks for live objects with the help of the fuzzy logic based SVM classifier. The proposed system makes it practical to train each individual object to its certain face with liveness detection, and achieves improvements in performance and accuracy.
    Keywords: Object-Specific Face; Genuine Object; Spoofing objects; Liveness Detection; Authentication; Anti-Spoofing; Feature Extractors; Region of Interest; HOG-LPQ Descriptors and Fuz-SVM Classifier.

  • Event Recognition and Classification in Sports Video Using HMM   Order a copy of this article
    by VIJAYAN ELLAPPAN, Rajkumar Rajasekaran 
    Abstract: Sports event recognition and classification is a challenging task due to the number of possible categories. On one hand, how to define legitimate event category labels and how to acquire training samples for these classes must be investigated; on the other hand, it is non-trivial to achieve acceptable classification performance. To address these issues, we propose the use of the spatio-temporal behaviour of an object in the footage as an embodiment of a semantic event. This is accomplished by modelling the evolution of the position of the object with a Hidden Markov Model (HMM). Snooker is used as an example for this research. The system firstly parses the video sequence based on the geometry of the content in the camera view and classifies the footage as a particular view type. Secondly, we consider the relative position of the white ball on the snooker table over the duration of a clip to embody semantic events. The temporal behaviour of the white ball is modelled using an HMM where each model is representative of a particular semantic event.
    Keywords: HMM; Event Recognition.
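A hedged sketch of the HMM scoring idea described above: each semantic event gets its own model, a clip's quantised white-ball positions are scored with the forward algorithm, and the best-scoring model labels the clip. The two toy models, their parameters and the two-symbol position alphabet are invented for illustration, not taken from the paper.

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    (scaled forward algorithm to avoid underflow)."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        ll += np.log(s)
        alpha = alpha / s
    return ll

# two toy event models over quantised ball positions {0: near cushion, 1: mid table}
pi = np.array([0.5, 0.5])
A_break = np.array([[0.9, 0.1], [0.2, 0.8]])    # sticky 'break-building' model
A_safety = np.array([[0.5, 0.5], [0.5, 0.5]])   # mixing 'safety shot' model
B = np.array([[0.8, 0.2], [0.3, 0.7]])

def classify(obs):
    """Label a clip with the event whose HMM explains it best."""
    scores = {'break': forward_loglik(obs, pi, A_break, B),
              'safety': forward_loglik(obs, pi, A_safety, B)}
    return max(scores, key=scores.get)
```

A clip whose ball position stays put is better explained by the sticky model, while a clip that keeps alternating fits the mixing model.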

  • Cursive Script Identification using Gabor features and SVM classifier   Order a copy of this article
    by Mohammed Aarif K.O, SIVAKUMAR PORURAN 
    Abstract: Script identification is a challenging component of optical character recognition systems for bilingual or multilingual document images. Significant research work has been reported on script identification in the last two decades, highly concentrated on languages like Latin, Chinese, Hindi and French. Very little effort has been made on script identification for cursive languages like Arabic, Urdu and Pashto. Most of the ancient Urdu literature, which is yet to be digitised, includes both Urdu and Arabic text. In this paper we present script identification of Urdu and Arabic text at word level using Gabor features with suitable orientations and frequencies. The proposed model is trained using an SVM classifier and the results achieved are very promising.
    Keywords: Script identification;cursive language; character recognition; Gabor filter; SVM.
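A minimal sketch of the Gabor feature extraction described above: build one real Gabor kernel per orientation at a fixed wavelength, and collect the mean response energy per orientation as a word-image feature vector. The kernel size, wavelength and sigma here are arbitrary illustrative choices, and the SVM stage is omitted.

```python
import numpy as np

def gabor_kernel(size=15, theta=0.0, wavelength=6.0, sigma=3.0):
    """Real part of a Gabor filter tuned to one orientation and frequency."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)

def gabor_energy(img, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Feature vector for a word image: mean response energy per orientation."""
    feats = []
    for t in thetas:
        k = gabor_kernel(theta=t)
        h, w = img.shape
        kh, kw = k.shape
        resp = np.array([[np.sum(img[i:i + kh, j:j + kw] * k)   # valid-mode correlation
                          for j in range(w - kw + 1)] for i in range(h - kh + 1)])
        feats.append(np.mean(resp ** 2))
    return np.array(feats)
```

Scripts with different dominant stroke orientations (e.g. Urdu vs. Arabic calligraphic styles) yield differently shaped energy vectors, which is what the SVM then separates.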

  • Improving Microaneurysm Detection from Non-dilated Diabetic Retinopathy Retinal Images using Feature Optimization   Order a copy of this article
    by Akara Thammastitkul, Bunyarit Uyyanonvara, Sarah Barman 
    Abstract: Diabetic retinopathy usually does not present symptoms in its early stage until it reaches a severe stage. An early stage of diabetic retinopathy is associated with the presence of microaneurysms (MAs). The occurrence of blindness can be reduced significantly if MAs are detected. This paper presents an approach to improve automatic MA detection using feature optimization. Candidate MAs are detected using mathematical morphology. Twenty original features are presented. To verify the relevance of all original features, a feature optimization process is performed. The optimal feature set is searched by a machine learning approach, like naïve Bayes.
    Keywords: Diabetic retinopathy; Microaneurysms; Machine learning approach; Feature optimization.

  • A new and efficient approach for the removal of high density impulse noise in mammogram   Order a copy of this article
    by Sreedevi Saraswathy Amma, Elizabeth Sherly 
    Abstract: This paper proposes a combined approach for removing impulse noise from digital mammograms, implementing detection followed by a filtering mechanism, in which detection is done using a robust local image statistical measure called the Modified Robust Outlyingness Ratio (MROR), followed by a filtering framework based on Extended Nonlocal Means (ENLM). All the pixels in the image are grouped into four different clusters based on the value of the MROR. The detection system consists of two stages, a coarse stage and a fine stage. In each stage, different decision rules are adopted to detect the impulse noise in each cluster, and to restore the image, the value of the noisy pixels is replaced with the modified median based value of the corresponding window based on the cluster position. For filtering, the NL-means filter is extended by introducing a reference image. Simulations are carried out on the MIAS database, the performance of the proposed filter has been evaluated quantitatively and qualitatively through experimental analysis, and the results are compared with several existing filters such as the Standard Median Filter (SMF), Adaptive Median Filter (AMF), Robust Outlyingness Ratio Non-Local Means (ROR-NLM) and Modified Robust Outlyingness Ratio Non-Local Means (MROR-NLM).
    Keywords: Impulse noise; image denoising; Non-Local means filter; noise detector; ROR; adaptive median filter; coarse stage; fine stage; MROR-ENLM.
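The detection-followed-by-filtering structure above can be caricatured with a much simpler detector: flag extreme (salt-and-pepper) values and replace each flagged pixel with the median of the clean pixels in its window. This stands in for the MROR/ENLM machinery only as the shape of the pipeline, not as the proposed method.

```python
import numpy as np

def remove_impulse(img, low=0, high=255):
    """Detect-then-filter: flag salt-and-pepper extremes, then replace each
    flagged pixel with the median of the clean pixels in its 3x3 window."""
    noisy = (img == low) | (img == high)
    out = img.astype(float).copy()
    for i, j in zip(*np.nonzero(noisy)):
        win = img[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        mask = (win != low) & (win != high)      # exclude other impulse pixels
        if mask.any():
            out[i, j] = np.median(win[mask])
    return out
```

The MROR measure generalises the crude "is it 0 or 255?" test to a graded outlyingness score, and the ENLM step replaces the local median with a patch-similarity weighted average.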

  • Improved Motion Estimation Algorithm Based on Integrity Index and Its Implementation in x265   Order a copy of this article
    by Vidya More, Mukul Sutaone 
    Abstract: With the development of fast motion estimation (ME) algorithms for the video compression standard H.264, the burden on integer-pel ME is slightly reduced. However, integer-pel ME is becoming computationally complex due to the demand for higher and higher resolution videos. On the other hand, there is always a tradeoff between the speed and performance of these search algorithms. This work improves the performance of a content-awareness enabled integer-pel ME algorithm for encoding fast and slow motion video sequences of the High Definition (HD) resolution category, viz. 1280x720 and 1920x1080. The algorithm proposes a novel notion of 'Integrity Index' with focus on increasing the PSNR and reducing the bit-rate. It is implemented in the framework of the x265 version 1.7 video encoder. The motion-independent ME algorithm is analysed quantitatively in terms of parameters based on rate distortion and ME time. The proposed algorithm is found to perform better on BD-Rate and BD-PSNR parameters for videos of both resolutions under consideration. The observed increase in motion estimation time is less than four seconds compared to the hexagonal search algorithm, which is the benchmark among fast ME algorithms.
    Keywords: Integer-pel; Motion Estimation; x265; HEVC; Content Awareness.
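For reference, the integer-pel full-search baseline that fast ME algorithms (hexagonal search, the proposed method) are measured against can be written compactly: exhaustively test every candidate displacement in a search window and keep the SAD-minimising motion vector. Block size and search range below are illustrative.

```python
import numpy as np

def full_search(cur, ref, bx, by, bs=8, srange=4):
    """Integer-pel full-search ME: return the motion vector (dx, dy) that
    minimises the sum of absolute differences (SAD) within +/- srange."""
    block = cur[by:by + bs, bx:bx + bs].astype(int)
    best, best_sad = (0, 0), None
    for dy in range(-srange, srange + 1):
        for dx in range(-srange, srange + 1):
            y, x = by + dy, bx + dx
            if 0 <= y and 0 <= x and y + bs <= ref.shape[0] and x + bs <= ref.shape[1]:
                sad = np.abs(block - ref[y:y + bs, x:x + bs].astype(int)).sum()
                if best_sad is None or sad < best_sad:
                    best, best_sad = (dx, dy), sad
    return best, best_sad
```

Fast algorithms such as hexagonal search visit only a small pattern of these candidates, which is exactly where the speed/quality tradeoff mentioned in the abstract comes from.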

  • Markov random field classification technique for plant leaf disease detection   Order a copy of this article
    by Anusha Rao, Shrinivas B. Kukarni 
    Abstract: In the recent era of technology, computer vision techniques have attracted the attention of researchers. These techniques help to identify and classify objects according to application requirements, and are widely used for plant leaf detection, helping to develop automated processes for plant leaf disease detection. A new approach is developed in this work for plant leaf disease detection using a Markov Random Field classification technique. An MRF based problem is formulated for disease detection. In the next stage, the general stages of a computer vision classification model, i.e., pre-processing and feature extraction, are applied. For pre-processing, noise removal and image enhancement models are applied, and feature extraction is a combination of statistical features. Neighbourhood pixel modelling and MRF classification models are applied to obtain the classification of the input data. The performance of three classification models is compared. The study shows that the proposed approach gives robust performance for plant leaf disease detection and classification.
    Keywords: plant leaf; plant disease; computer vision; Markov random field; MRF.

Special Issue on: Recent Trends and Developments of Computer Vision and Image Processing

  • An Approach for Infrared Image Pedestrian Classification based on Local Directional Pixel Structure Elements' Descriptor   Order a copy of this article
    by S. Rajkumar 
    Abstract: Pedestrian classification is a major problem in infrared (IR) images due to lack of shape, low signal-to-noise ratio and complex backgrounds. It finds applications in agriculture, forestry, night vision monitoring systems, intelligence systems and defence systems. In this paper, a local directional pixel structure elements' descriptor (LDPSED) based pedestrian classification approach is proposed to overcome these problems. In addition, to segment the objects (pedestrian and non-pedestrian) from an IR image, an interest point detection approach is proposed. The proposed method consists of three steps: segmentation, feature extraction and classification. Firstly, objects are segmented from the input image. Secondly, feature extraction is carried out on the segmented objects. Finally, a support vector machine (SVM) is implemented for classification of objects in the IR image into pedestrian and non-pedestrian. To prove the effectiveness of the proposed approach, we have conducted experimental tests on the standard OTCBVS-BENCH-thermal collection over the OSU thermal pedestrian database. In addition, the classification results of the proposed approach are compared with existing approaches. The efficiency of the proposed approach is proved by its high classification accuracy.
    Keywords: Infrared image; Local directional pattern; Structure element descriptor; Support Vector Machine; Pedestrian classification.

  • An Efficient Image System based Grey Wolf Optimizer Method for Multi Media Image Security using Reduced Entropy Based 3D Chaotic Map   Order a copy of this article
    by SRINIVAS KOPPU, Madhu Viswanatham V 
    Abstract: Chaotic maps play an important role in information sharing. In this paper a Grey Wolf Optimizer is used with a reduced-entropy based 3D chaotic map. The high coefficients are selected based on the reduced entropy value to identify the optimized parameters and obtain unpredictable random values. Time complexity, autocorrelation of the V, H and D elements, histograms of the original and cipher images, peak signal-to-noise ratio, and NPCR and UACI values are computed from the cipher image. The empirical results show that the proposed method provides better imperceptibility and defends against various attacks. To prove the accomplishment of the method, several experiments were conducted and the results compared with existing systems.
    Keywords: Encryption/Decryption; Chaos; 3D Chaotic map; Entropy; Grey Wolf Optimizer.
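The NPCR and UACI figures cited above are standard cipher-image difference metrics and can be computed directly; a small sketch, assuming 8-bit grey images:

```python
import numpy as np

def npcr_uaci(c1, c2):
    """NPCR: percentage of pixel positions that differ between two cipher
    images; UACI: mean intensity change relative to the 255 grey-level range."""
    c1, c2 = c1.astype(float), c2.astype(float)
    npcr = 100.0 * np.mean(c1 != c2)
    uaci = 100.0 * np.mean(np.abs(c1 - c2) / 255.0)
    return npcr, uaci
```

For a secure cipher, encrypting two plaintexts differing in one pixel should give NPCR close to 100% and UACI around 33%, which is what such tables of results are checked against.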

  • Caliber fuzzy c-means algorithm applied for retinal blood vessel detection   Order a copy of this article
    by Gowri Jeyaraman, Janakiraman Subbiah 
    Abstract: Retinal blood vessel detection plays a vital role in the detection of retinal diseases like diabetic retinopathy and glaucoma. This paper presents an innovative unsupervised retinal blood vessel detection technique. The first step is to generate a vessel-enhanced image; then, using the Caliber Fuzzy C-means (CFCM) technique, the retinal image is clustered; next the clustered image is passed to the Canny edge operator, and finally the retinal image is post-processed. The CFCM clustering method for blood vessel detection is based on the choice of the number of clusters. Using the CFCM clustering function, the cluster centres are computed, which commonly divides the image into four clusters. The proposed technique is clearly a forceful modification of fuzzy c-means combined with the Canny algorithm. The proposed algorithm accomplishes an accuracy of about 95% on retinal images from three data sets: DRIVE, STARE, and CHASE_DB1.
    Keywords: fuzzy c-means clustering; retinal image; self organized map.
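The fuzzy c-means core that CFCM builds on alternates two closed-form updates, memberships from distances and centres from membership-weighted means; a plain 1-D sketch (m = 2 and quantile-spread initial centres are illustrative choices, and the caliber modification itself is not reproduced here):

```python
import numpy as np

def fcm(x, k=2, m=2.0, iters=50):
    """Standard fuzzy c-means on 1-D intensities: alternate the fuzzy
    membership update and the membership-weighted centre update."""
    c = np.quantile(x, np.linspace(0.1, 0.9, k))             # spread initial centres
    for _ in range(iters):
        d = np.abs(x[:, None] - c[None, :]) + 1e-12          # point-to-centre distances
        w = d ** (-2.0 / (m - 1.0))
        u = w / w.sum(axis=1, keepdims=True)                 # membership degrees in [0, 1]
        um = u ** m
        c = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)   # fuzzy-weighted means
    return c, u
```

In the vessel pipeline the clustered intensity map (rather than a hard k-means labelling) is what gets handed to the Canny operator, so borderline vessel pixels retain partial membership.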

  • Effective Image Stego Intrusion Detection System using Statistical Footprints of the Steganogram and Fusion of Classifiers   Order a copy of this article
    by Hemalatha J 
    Abstract: Determining the processing history of a digital image is a significant problem for steganalysers and forensic analysers. At present, the most precise steganalysis techniques are built as supervised classifiers by extracting feature vectors from the digital media. This paper presents an ensemble classification method for an effective image stego intrusion detection system on JPEG images, consisting of a two-step process. In the first step, the features are engineered as higher-order statistics for blind steganalysis. In the second step, an ensemble classifier is used by fusing classifiers such as support vector machines, neural networks and k-nearest neighbours. By applying these classifiers to the features, the steganogram and the clean (unadulterated) carrier signals are effectively discriminated. For generating the image dataset, images undergo six embedding schemes with different payloads. Experimental results show that the proposed approach remarkably improves metrics such as specificity, sensitivity and accuracy (94%) of the system.
    Keywords: SVM; Ensemble; Higher order statistics.
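The classifier-fusion idea above can be illustrated with simple majority voting over per-classifier label lists. The fusion rule actually used in the paper is not specified here, so this is a hedged stand-in, and the three classifier outputs are hard-coded toy data rather than real SVM/NN/kNN predictions.

```python
# Majority-vote fusion of hard decisions from several classifiers
# (illustrative stand-in for the paper's SVM/NN/kNN ensemble).
from collections import Counter

def fuse(predictions):
    """predictions: list of label lists, one list per classifier."""
    fused = []
    for votes in zip(*predictions):          # votes for one sample
        fused.append(Counter(votes).most_common(1)[0][0])
    return fused

svm_out = ['stego', 'clean', 'stego', 'clean']   # toy outputs
nn_out  = ['stego', 'stego', 'stego', 'clean']
knn_out = ['clean', 'clean', 'stego', 'clean']
print(fuse([svm_out, nn_out, knn_out]))  # → ['stego', 'clean', 'stego', 'clean']
```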

  • Incipient Knowledge in Protein Folding Kinetics States Prophecy Using Deep Neural Network based Ensemble Classifier   Order a copy of this article
    Abstract: In this paper, we focus on incipient knowledge in the prediction of protein folding kinetics states using a deep neural network based stacking technique in an ensemble classifier. The protein folding process is highly crucial for deciding molecular function. The protein folding kinetic state indicates whether the folding proceeds through an intermediary or not: the folding structure can form with a stable intermediary (3S/3-state) or without a stable intermediary (2S/2-state). Furthermore, there is a vast number of proteins in the PDB whose folding mechanisms remain unknown. In this paper, we propose stacking with a deep neural network for predicting protein folding kinetics states. In the first learning level, we use five base classifiers, i.e., naive Bayes, decision tree, random forest, support vector machine and neural network; in the second, meta-learning level, we use a rule-based method and deep neural network based stacking in the ensemble classifier to increase the accuracy.
    Keywords: protein folding; two states; multi states; deep neural network; stacking; ensemble classifier.
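The two-level stacking scheme described above can be sketched in miniature: base-learner outputs become meta-features for a second-level combiner. The base models below are stubbed with fixed decision rules and the meta-learner is a trivial rule, purely to show the data flow; none of this reproduces the paper's actual classifiers or deep network.

```python
# Two-level stacking sketch: level-1 outputs feed a level-2 combiner.
def base_a(x):                  # stub "classifier" on feature 0
    return 1 if x[0] > 0.5 else 0

def base_b(x):                  # stub "classifier" on feature 1
    return 1 if x[1] > 0.5 else 0

def meta(features):             # level-2 rule-based combiner (illustrative)
    return 1 if sum(features) >= 1 else 0

def stack_predict(x):
    return meta([base_a(x), base_b(x)])  # meta-features = base decisions

print(stack_predict([0.9, 0.1]))  # → 1 (one base learner fires)
print(stack_predict([0.2, 0.3]))  # → 0 (no base learner fires)
```

In the paper the level-2 combiner is a deep neural network trained on such meta-features rather than a hand-written rule.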

    by Balamurugan P, Viswa Bharathy AM, Marimuthu K, Niranchana R 
    Abstract: Cancer poses a big challenge in the field of pathological diagnosis. Feature selection of cells is highly important in isolating the affected cells. The classification of cancer cells is gaining importance among clinical researchers. The Gene Expression Profile (GEP) is used to better classify genes in a cell or tissue. Gene Expression Data (GED) differ for every gene according to the cell or tissue from which it originated. Based on the GED, cancer cells can be classified into seven categories according to the cell or tissue of origin. The infected cells can be graded from level one to four based on their growth and difference from other unaffected cells. Many techniques have been developed in the past for classifying cancer-affected genes. In this paper we propose a modified classification algorithm, Bi-Layer Mutated Particle Swarm Optimization (BLMPSO). The microarray dataset used for testing the method is the Affymetrix Human Genome U95Av2 Array. The simulation results showed that the proposed technique performs better in terms of classification based on GED than other existing methods.
    Keywords: cancer cells; feature selection; classification; gene expression; mutation; PSO.
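The optimisation core named above can be illustrated with plain particle swarm optimisation on a 1-D objective. The paper's bi-layer mutated variant (BLMPSO) is not described in enough detail to reproduce, so the sketch below is textbook PSO with conventional inertia and acceleration constants, minimising a toy quadratic.

```python
# Textbook PSO on a 1-D objective (NOT the paper's BLMPSO variant).
import random
random.seed(0)  # deterministic demo run

def pso(f, lo, hi, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [random.uniform(lo, hi) for _ in range(n)]
    vel = [0.0] * n
    pbest = pos[:]                       # personal bests
    gbest = min(pos, key=f)              # global best
    for _ in range(iters):
        for i in range(n):
            vel[i] = (w * vel[i]
                      + c1 * random.random() * (pbest[i] - pos[i])
                      + c2 * random.random() * (gbest - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], lo), hi)  # clamp to bounds
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i]
    return gbest

best = pso(lambda x: (x - 3) ** 2, -10, 10)  # minimum at x = 3
```

In the classification setting of the paper, the objective would score a candidate feature subset or classifier parameterisation rather than a toy quadratic.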

  • Effective user preference mining based personalized movie recommendation system   Order a copy of this article
    by Subramaniyaswamy V, Logesh R, Vijayakumar V, Hamid Reza Karimi, Marimuthu Karuppiah 
    Abstract: One of the primary issues of many websites is the suggestion of multiple choices to users at the same time, which makes it more complex and time-consuming to find the right product. Web mining and recommendation systems based on user behaviour help users by providing essential information without asking explicitly. Several movie recommendation systems are available to suggest movies, but often they do not do so effectively. To achieve enhanced effectiveness and efficiency, users' movie ratings were retrieved, cleaned, formatted and grouped into proper, meaningful sessions, and a data profile was developed. In this paper, we have developed a new ontology for a clear and better understanding of the movie domain. The user data consisting of movie ratings is used to recommend movies to users. For the classification of users, we use an Adaptive K-Nearest Neighbour (AKNN) approach, and after the classification process, movies are recommended to the active target user. The results of the proposed recommendation approach are compared with existing baseline methods, and they prove the presented approach to be proficient.
    Keywords: Recommender Systems; Personalization; Adaptive kNN; Ontology; Web Mining; Classification.
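The neighbour-finding step of such a recommender can be sketched with cosine similarity over user rating vectors. The paper's adaptive-k rule is not public, so the sketch fixes the neighbourhood implicitly by sorting all users, and the rating matrix is invented toy data.

```python
# Cosine-similarity neighbour search over user rating vectors
# (illustrative; the paper's AKNN adapts k, which is omitted here).
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

ratings = {                 # user -> ratings for movies m1..m4 (toy data)
    'u1': [5, 4, 0, 1],
    'u2': [4, 5, 0, 2],
    'u3': [1, 0, 5, 4],
}
target = [5, 5, 0, 1]       # the active user's ratings
neighbours = sorted(ratings, key=lambda u: cosine(target, ratings[u]),
                    reverse=True)
print(neighbours[0])        # most similar user → 'u1'
```

Movies rated highly by the top neighbours but unseen by the target user would then be recommended.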

  • Steganographic approach to Enhance the Data Security in Public Cloud   Order a copy of this article
    by Prabu S, Gopinath Ganapathy 
    Abstract: Steganography is the art of concealing the fact that communication is taking place, by hiding information within other information. Image encryption, a rapidly developing technology in the field of image processing, can be described as the method of encoding messages and data in such a way that they can be accessed by authorised entities only. This paper presents a survey of image steganography, along with its applications and techniques. It also attempts to identify the requirements of a good steganographic algorithm and briefly considers the steganography techniques that are better suited to specific applications. Data transmission across networks is a common practice, given the advancement of Internet and multimedia technologies that grows exponentially today. The paper presents the secret sharing of a message by concealing it in an image using the most commonly used LSB (Least Significant Bit) method. Here, steganography is aimed at hiding the information imperceptibly within any medium (image, audio or video) so that it is unnoticeable to the unintended recipient, thereby achieving secured applications.
    Keywords: Encryption; Decryption; Least Significant Bit; Bitmap Steganography; Data Security; Cloud Computing.
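The LSB method the abstract names can be shown concretely over a byte array standing in for pixel values (real images would need an image decoder, which is omitted): each message bit overwrites the least significant bit of one cover byte.

```python
# Least-significant-bit embedding over a byte array acting as "pixels".
def embed(pixels, message):
    # flatten the message into bits, most significant bit first
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the LSB
    return bytes(out)

def extract(pixels, length):
    bits = [p & 1 for p in pixels[:length * 8]]
    return bytes(sum(b << (7 - i) for i, b in enumerate(bits[j:j + 8]))
                 for j in range(0, length * 8, 8))

cover = bytes(range(200))      # toy "image"
stego = embed(cover, b'hi')
print(extract(stego, 2))       # → b'hi'
```

Each cover byte changes by at most one intensity level, which is why the embedding is visually imperceptible.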

Special Issue on: Applications of Computer and Engineering Technology in Enabling Technologies and Industrial Case Studies

  • Impact of the Lossy Image Compression on the Biometric System Accuracy: A Case Study of Hand Biometrics   Order a copy of this article
    by Djamel Samai, Abdallah Meraoumia, Mouldi Bedda, Abdelmalik Taleb-Ahmed 
    Abstract: Biometric recognition systems often use images to authenticate or identify persons. Storing large images requires large storage space; to reduce it, compression methods are employed. In this paper, we analyse the effect of lossy image compression on the performance of biometric identification systems. We propose a scheme to evaluate Multi-Spectral Palmprint (MSP) and Finger-Knuckle Print (FKP) recognition performance at low bitrates. The images are compressed using Set Partitioning In Hierarchical Trees (SPIHT) encoding. A powerful texture descriptor is used to represent the extracted features of the different images. It is based on quantising the phase information of the local Fourier transform, which leads to a computationally efficient and compact feature representation. We employ the Nearest Neighbour (NN) classifier or the nonlinear multiclass Support Vector Machine (SVM) model to classify the extracted features. The obtained results show that compression does not significantly affect the performance of the recognition systems at low bitrates; thus, low-bitrate MSP or FKP images perform equivalently to higher-bitrate images in the recognition system. To improve the proposed recognition systems and confirm the results found for the unimodal systems, we built efficient multimodal recognition systems by fusing modalities at the matching-score level. Identification experiments proved the superiority of fused modalities over every single modality.
    Keywords: Biometrics; MSP; FKP; SPIHT; ML-LPQ; Matching-score level.

  • An Extended Infrastructure Security Scheme for Multi-Cloud Systems with Verifiable Inter-Server Communication Protocol   Order a copy of this article
    by Vijay Gr, A. Rama Mohan Reddy 
    Abstract: Cloud service providers need a higher level of security than local systems, as users outsource their data to a remote system relying on its genuineness, and the data must also be protected from intruders that exist globally. Multi-cloud systems are more prone to various kinds of attacks due to the extensive internal communication of sensitive data. Multi-cloud infrastructure security can be enhanced by using stringent encryption algorithms or protocols with several integrity checks on each transaction, but that may drastically reduce the efficiency of the system. This research is one of a kind in that we propose a multi-cloud framework comprising different techniques for data security as well as internal server communication security that can efficiently secure the privacy of data from eavesdroppers. The use of efficient cryptographic algorithms, a modified Diffie-Hellman key exchange scheme and a fragmentation technique assures the overall efficiency of the system along with high security. The newest variant of SHA, i.e. SHA-3, outperformed the other variants in terms of time efficiency and security. Results on a multi-cloud setup demonstrate the efficiency of the proposed framework in terms of the time taken for data storage, computation and retrieval while eliminating the risk of attacks.
    Keywords: Cloud Computing; Multi-Cloud; Data Security; Cryptography; Key Exchange Protocol.
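The key-exchange-plus-SHA-3 combination above can be sketched with the classic (unmodified) Diffie-Hellman exchange followed by a SHA-3 key-derivation step; the paper's modifications are not public, and the prime below is a deliberately tiny demonstration value, never suitable for real use.

```python
# Classic Diffie-Hellman with SHA-3 key derivation (toy parameters only;
# the paper's modified scheme is not reproduced here).
import hashlib

p, g = 0xFFFFFFFB, 5       # small demo prime; never use in production
a, b = 123456, 654321      # private values of the two servers

A = pow(g, a, p)           # public values exchanged over the wire
B = pow(g, b, p)

shared_a = pow(B, a, p)    # each side derives the same shared secret
shared_b = pow(A, b, p)
key = hashlib.sha3_256(shared_a.to_bytes(8, 'big')).hexdigest()
print(shared_a == shared_b)  # → True
```

Hashing the raw shared secret with SHA-3 before use is a standard key-derivation hygiene step, and `hashlib.sha3_256` has been in the Python standard library since 3.6.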

    by Chetan Huchegowda, Indumathi G, Naveen Huchegowda 
    Abstract: To minimise storage space and enable fast transfer of digital images, it is necessary for medical images to undergo image compression. There are various techniques by which images are diagnosed, and based on these the image compression is performed. The choice of filters in image compression is an issue that affects the quality of the image. Hence, a novel biorthogonal filter using the lifting scheme has been developed. The proposed architecture exhibits the same characteristics as second-generation wavelets. The proposed architecture is designed using MATLAB for different medical images, and the PSNR, SNR, MSE, BPP and compression ratio values are calculated. Finally, the proposed lifting scheme architecture is designed using Verilog to obtain details of area, delay and power.
    Keywords: DWT; DTDWT; medical image transmission; biorthogonal filter.

  • MOEMS Based Accelerometer Sensor Using Photonic Crystal for Vibration Monitoring in Automotive Systems   Order a copy of this article
    by SUNDAR SUBRAMANIAN, Gopalakrishna K, Thangadurai N 
    Abstract: Diagnosing vibration in automobiles has high priority since it provides comfort to the passengers inside the vehicle. This paper presents a MOEMS accelerometer sensor using a photonic crystal. A spring-mass system with photonic crystal technology is visualised and scrutinised. The optical sensing system with photonic crystal technology is studied and simulated with rods-in-air and holes-in-slab configurations. The deflection of the rectangular defect slab for vertical and horizontal movement due to the applied force is verified. A Gaussian pulse propagated through the defect region in the photonic crystal slab results in a wavelength shift for each deflection of the slab. The transmission spectrum is obtained for each deflection direction of the slab and each configuration. The Q factor, analysed for each displacement of the slab, is found to be 3210 for HIS vertical movement. It is found that a distinct change in wavelength is obtained for the holes-in-slab configuration during vertical and horizontal movement of the slab compared to the results of the rods-in-air configuration. The obtained results show the feasibility of future fabrication of the HIS configuration.
    Keywords: Photonic crystal; Accelerometer; Rods in Air (RIA); Holes in Slab (HIS); Vibration; Micro displacement; Light Propagation; Q-factor; Monitoring; MOEMS.

    by SASIREKHA VENKATACHALAM, Ilangkumaran Mani, Arulmurugan Loganathan 
    Abstract: In a heterogeneous wireless environment, network selection is a strategic issue and has a significant impact on providing the best Quality of Service (QoS) to users. The selection of an apt network among various alternatives is a kind of Multi-Criteria Decision Making (MCDM) problem. This paper proposes a model based on VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) under a fuzzy environment and Grey Relational Analysis (GRA) for the selection of a suitable network in a heterogeneous wireless environment. Triangular fuzzy linguistic variables are used to handle the vagueness and subjectivity of the decision-making process. This study considers four alternatives, Universal Mobile Telecommunications System (UMTS), Long Term Evolution (LTE), Worldwide Interoperability for Microwave Access (WiMAX) and wireless local area network (WLAN), and six evaluation criteria, throughput, delay, jitter, packet loss, cost and security, for choosing the suitable network in the heterogeneous wireless environment. An efficient pair-wise comparison process and ranking of alternatives can be achieved for optimum network selection through the integration of fuzzy sets with GRA and VIKOR. The obtained preference orders of the networks for Fuzzy-GRA-VIKOR and Fuzzy VIKOR are LTE > WiMAX > WLAN > UMTS and LTE > WiMAX > UMTS > WLAN respectively; hence, comparing both methods, LTE is selected as the best network among the various alternatives. The combination of Fuzzy-GRA-VIKOR techniques is assessed and suggested for each type of network traffic class. This paper highlights the significance of MCDM techniques for the network selection problem in heterogeneous wireless environments.
    Keywords: Multi-Criteria Decision Making; MCDM; VlseKriterijumska Optimizacija I Kompromisno Resenje; VIKOR; Grey Relational Analysis; GRA; fuzzy sets; network selection.
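The GRA half of the method above can be sketched numerically: normalise the decision matrix, measure each alternative's deviation from the ideal, and average the grey relational coefficients into a grade. The scores below are invented toy values (the paper additionally fuzzifies the inputs and applies VIKOR, both omitted here), so the resulting ranking is illustrative, not the paper's.

```python
# Grey relational grades for ranking alternatives (toy, larger-is-better
# criteria assumed; fuzzification and VIKOR from the paper are omitted).
def gra_grades(matrix, rho=0.5):
    cols = list(zip(*matrix))
    # column-wise min-max normalisation to [0, 1]
    norm = [[(v - min(c)) / (max(c) - min(c) or 1)
             for v, c in zip(row, cols)] for row in matrix]
    ideal = [1.0] * len(cols)
    deltas = [[abs(i - v) for i, v in zip(ideal, row)] for row in norm]
    dmax = max(max(r) for r in deltas)
    dmin = min(min(r) for r in deltas)
    # grey relational coefficient, rho is the distinguishing coefficient
    coef = [[(dmin + rho * dmax) / (d + rho * dmax) for d in row]
            for row in deltas]
    return [sum(row) / len(row) for row in coef]  # grade = mean coefficient

# rows: LTE, WiMAX, WLAN, UMTS; columns: throughput, security (toy scores)
grades = gra_grades([[100, 9], [70, 7], [54, 5], [2, 8]])
```

The alternative with the highest grade (here the first row) is preferred.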

  • An empirical analysis of the statistical learning models for different categories of Cross Project Defect Prediction.   Order a copy of this article
    by Lipika Goel, Mayank Sharma, Sunil Kr Khatri, D. Damodaran 
    Abstract: Currently, the research community is addressing the problem of defect prediction given the availability of project defect data. The availability of data from different projects has led to extending the research to cross projects. Cross-project defect prediction has now become an accepted area of software project management. Various defect prediction models have been applied to cross-project data, focusing on analysis and applications to improve software reliability. In this paper, an empirical study is carried out to investigate the predictive performance of available within-project and cross-project defect prediction models. Furthermore, different categories of cross-project data are taken for training and testing to analyse various statistical models. Further, in this study, k-fold cross-validation is performed on the training datasets to evaluate the training accuracy of the models. The data models are analysed and compared using various statistical performance measures. The findings of the empirical analysis state that the Gradient Boosting predictor outperforms the others in the cross-project defect prediction scenario. The results also infer that cross-project defect prediction is comparable to within-project defect prediction with statistical significance. Thereby, even if little or no within-project data exists, data from cross projects can be considered for defect prediction.
    Keywords: defect prediction; cross projects; within project; machine learning.

  • Parameter extraction of PSP MOSFET model using Particle Swarm Optimization   Order a copy of this article
    by Amit Rathod, Rajesh Thakker 
    Abstract: System on Chip (SoC) architecture offers performance acceleration by offloading compute-intensive functions to FPGA logic, together with an application-specific instruction set processor (ASIP). In this paper, we report a novel approach to SoC implementation of MOSFET parameter extraction at the hardware level using Xilinx's Zynq 7000 SoC AVNET ZedBoard platform. Parameter extraction of the PSP MOS model for 65 nm technology devices has been carried out using the Particle Swarm Optimization (PSO) algorithm. It is demonstrated that the SoC implementation of the PSO algorithm is able to accurately minimise the rms error between model-generated data and experimental data to below 9.5%. The ARM Cortex-A9 processor inside the Zynq 7000 SoC was found capable of executing the MOSFET model library. It has been observed that the proposed SoC implementation of the PSO algorithm runs 3.68 times faster than a software-based approach.
    Keywords: Evolutionary algorithm; MOSFET parameter extraction; Particle Swarm Optimization; PSP MOSFET model; System on Chip; Zynq 7000 SoC.

  • Dye sensitized solar power generating window: towards environmentally sustainable energy efficiency in ICT   Order a copy of this article
    by Zulfiqar Ali Umrani, Mehboob Khatani, Mohammad Aslam Uqaili, Norani Muti Mohamed, Nor Hisham Hamid, Bhawani Shankar Chowdhary 
    Abstract: Information and Communication Technology (ICT) equipment generates a significant amount of greenhouse gas (GHG), which can be reduced via the utilisation of solar energy. Presently, ICT produces more than 830 million tons of carbon dioxide (CO2), about 2 percent of global CO2 emissions, and this is expected to double by 2020 [1][2][3]. There is a need for ICT to first standardise energy consumption and emissions and then investigate means to reduce energy consumption via efficiency and innovation. The electricity consumption that dominates the direct carbon footprint of the ICT sector can be reduced by using renewable energy sources. Solar cells that operate efficiently under diffuse lighting are of great practical interest, as they can serve as electric power sources for portable electronics and for devices in wireless sensor networks and the Internet of Things. This property allows them to operate and generate power inside the built environment. The dye-sensitised solar cell (DSC) is a green and renewable energy device that works well in low-light conditions. The transparent character of DSCs makes them suitable for building-integrated photovoltaic (BIPV) applications such as window systems. In this study, we fabricated and assembled a transparent power-generating window with an active area of 0.228 m2, based on a dye-sensitised nanocrystalline TiO2 solar module, that generates ~1.4 A current and 5.8 V open-circuit voltage at 60 mW/cm2, and ~0.5 A short-circuit current and 5.3 V at 33 mW/cm2; it was installed in a building environment to power ICT products. The device was connected to ICT equipment and tested, and the DSC successfully powered the ICT system. Such building-integrated DSC systems can potentially power ICT devices in homes and offices.
    Keywords: ICT; dye sensitized solar cells; sustainability; Building integrated photovoltaics.

Special Issue on: INCAMA2018 Research Advances in Mechanical Engineering

    by Sonia S Raj, Pradeep P, Edwin Raja Dhas John 
    Abstract: Polymer composites fabricated from natural fibres are gaining popularity since they possess numerous advantages in automotives. However, natural fibres suffer from lower strength, which can be overcome through hybridisation with stronger synthetic fibres such as carbon or glass. This work explores the potential reinforcement of fibres from palm leaf stalks for fabricating polymer composites. The fibres were pre-treated, formed into fibre mats and hybridised with glass fibres prior to reinforcement. These mats were reinforced in the polyester resin matrix as different layers to form a hybrid composite. Experiments were performed by varying the fibre length, fibre volume and treatment using response surface methodology, with tensile strength measured as the response. The tensile strength was found to be maximum for 8% potassium permanganate (KMnO4) treated palm fibres with an optimum fibre length of 40 mm and a volume fraction of 20%. The surface study of these composites through SEM (Scanning Electron Microscope) examination was satisfactory. Hence, this specimen combination is best suited and recommended for manufacturing components such as car bonnets and bumpers.
    Keywords: Parameters; Composite; response surface methodology.

Special Issue on: ICIMIA 2017 Innovative Computer-Aided Techniques for Future Wireless Applications

  • Divide-by-16/17 dual modulus prescaler design with enhanced speed in 180nm CMOS technology   Order a copy of this article
    by Uma Nirmal, V.K. Jain 
    Abstract: In this work, we propose a high-speed dual-modulus divide-by-16/17 prescaler, Design IV, with an 8.9 GHz operating rate. It uses an RE-3 type DFF in the synchronous divide-by-2/3 prescaler design and the asynchronous divide-by-8 counter design, reducing design complexity, capacitive loading and delay. The proposed Design IV shows better results in terms of both speed and power than other ratioed and ratioless divide-by-16/17 prescalers. It is implemented in 180 nm CMOS technology and consumes only 0.38 mW from a 1 V supply voltage. The speed of the new Design IV is improved by ~53% compared with the conventional circuit, whose operating frequency is 5.8 GHz.
    Keywords: divide by 16/17 dual modulus prescaler (DMP); TSPC; D Flip Flop (DFF); RE-0; RE-1; RE-2; RE-3; RE-4.
    DOI: 10.1504/IJCAET.2021.10018090
  • IOT enabled traffic sign recognition for safe driving   Order a copy of this article
    by Iwin Thanakumar Joseph 
    Abstract: In this paper, we have designed and constructed an IoT-based platform that can automatically send information about road signs. We demonstrate the basic idea of how to set up communication between an approaching vehicle and the sign boards. This system plays an important role in the recognition and detection of specific locations such as markets, schools, speed breakers, universities, hospitals and offices. Detecting and recognising traffic signs is a challenging problem. Traffic sign recognition (TSR) is an issue of concern for drivers, possibly because of the speed at which they tend to travel, especially on highways. We present a device that detects road signs with the help of IoT using very simple logic. This paper provides an overview of traffic sign detection with the help of the output generated by IoT devices such as the NodeMCU, and addresses the problem of fast traffic sign recognition and detection to enhance safe driving. The proposed method includes the following stages. First, the connection between the server and the client is established when the client enters the Wi-Fi zone created by the server. Second, the client and server enter a state of communication, in which the client receives the input provided by the server and produces the corresponding output. In the third and final stage, an audio alert is given based on the output of the client. The proposed approach can be very helpful for the development of a safe driving environment.
    Keywords: Traffic Sign Recognition (TSR); NodeMCU; Wi-Fi zone - ESP8266; Audio alert - APR33A3.
    DOI: 10.1504/IJCAET.2021.10016782
  • A Hybrid SATS Algorithm Based Optimal Power Flow for Security Enhancement Using SSSC   Order a copy of this article
    by Kumar Cherukupalli, Padmanabha Raju Chinda, Sujatha Peddakotla 
    Abstract: The security and performance of the power system are the prime concerns in its planning and operation. It is essential to devise proper measures for the maintenance and improvement of security in the power system. The Static Synchronous Series Compensator (SSSC) is a type of series flexible AC transmission system device. The present research proposes a hybrid simulated annealing and tabu search (Hybrid SATS) algorithm with SSSC to solve the security-constrained optimal power flow problem. The primary objective of the research work is to enhance the security of the power system and minimise the generator fuel cost. Contingency ranking is used to select line outages. The line flow limit violations in various single line outages are relieved effectively by the Hybrid SATS with SSSC method, which keeps power flows within their security limits. Simulation studies are carried out on the standard IEEE 30-bus system to verify the effectiveness of the proposed hybrid method, and the obtained outcomes are compared with the SA-with-SSSC and TS-with-SSSC methods.
    Keywords: Security Constrained Optimal Power Flow; Simulated Annealing; Tabu Search; Static Synchronous Series Compensator.

  • HUPM-MUO: High Utility Pattern Mining under Multiple Utility Objectives   Order a copy of this article
    by Muralidhar A, PATTABIRAMAN V 
    Abstract: Mining patterns of interesting items plays a significant role in the data analysis and decision-making strategies of real-time applications. Often the term "interest" in pattern discovery denotes the frequency of the pattern. Recent research in the data mining domain considers the utility of the item instead of its frequency, where utility often indicates profit. This manuscript argues that neither the utility nor the frequency of the itemset alone influences the target objective. Moreover, profit is not the only utility factor of the itemset; apart from profit, objectives such as storage, saleability and other domain-specific requirements can also be utility factors. In line with this argument, the manuscript defines a novel model that discovers the top-K high utility patterns under multiple utility objectives (HUPM-MUO). The experimental study was carried out on various datasets and portrays the performance advantage of the proposed model over other contemporary models.
    Keywords: High Utility Itemset; Utility Mining; Rank Distribution Distance; Multi-Utility Objectives.
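The basic notion of itemset utility behind this line of work can be shown with a single (profit-only) objective: the utility of an itemset is the sum, over transactions containing it, of quantity times unit profit. The profit table and transactions below are toy data, and the multi-objective ranking of HUPM-MUO is not reproduced.

```python
# Single-objective utility mining sketch (profit only; HUPM-MUO's
# multi-objective ranking is not reproduced here).
from itertools import combinations

profit = {'a': 4, 'b': 1, 'c': 6}     # unit profit per item (toy)
transactions = [                       # item -> purchased quantity
    {'a': 2, 'b': 3},
    {'a': 1, 'c': 2},
    {'b': 5, 'c': 1},
]

def utility(itemset):
    total = 0
    for t in transactions:
        if all(i in t for i in itemset):           # transaction contains set
            total += sum(t[i] * profit[i] for i in itemset)
    return total

items = sorted(profit)
ranked = sorted((frozenset(s) for r in range(1, 3)
                 for s in combinations(items, r)),
                key=utility, reverse=True)
top2 = ranked[:2]   # the top-K high utility itemsets, K = 2
```

Here {c} scores 18 and {a, c} scores 16, so those two form the top-2; a multi-objective variant would rank by several such utility functions at once.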

  • A Hybrid Approach to Diagnosis Mammogram Breast Cancer Using Optimally Pruned Hybrid Wavelet Kernel Based Extreme Learning Machine with Dragonfly Optimization   Order a copy of this article
    by Diderot. P. Kumara Guru, N. Vasudevan 
    Abstract: Breast cancer is one of the leading dangerous cancer types that may result in death, so it is necessary to detect the cancer spot and provide early diagnosis, termed early detection. The detection of this type of cancer is difficult at the initial stage because the cancerous tumours are rooted in the common breast tissue structures. The main objective of this research is to model a breast cancer prediction system with a novel wavelet-based machine learning approach. The prediction of breast cancer for the diagnosis process is made by the proposed algorithm, named the Hybrid Optimally Pruned Wavelet Kernel-based Extreme Learning Machine (HOP-WKELM). Initially, the input is pre-processed for noise reduction using a Kuan filter. After that, the Quantum Evolutionary Algorithm (QEA) is applied to segment the cancerous part of the mammogram image, and features are extracted using Grey-Level Co-occurrence Matrix (GLCM), Gabor filter and Local Binary Pattern (LBP) features. The extracted features are classified using the HOP-WKELM classifier, in which the WKELM learning algorithm utilises the Dragonfly Swarm Behaviour-based Optimization (DSBO) approach to optimise the parameters of the kernel functions. The proposed strategy achieved a maximum accuracy of 98.8% and a maximum precision of 98.1% when compared with existing AdaBoost systems.
    Keywords: breast cancer; Hybrid Wavelet Kernel-based Extreme Learning Machine; GLCM; Gabor; LBP; DSBO.

  • Hardware Implementation of modified SSD LDPC decoder   Order a copy of this article
    by Rajagopal Anantharaman, Karibasappa K, Vasundara Patel K.S 
    Abstract: In this work, a modification of the Simplified Soft Distance (SSD) algorithm is discussed, using the soft Euclidean squared distance as a performance metric. The SSD algorithm is theoretically independent of the signal-to-noise ratio of the received signal. Multiplication and addition terms are the only constituents of this algorithm, which reduces its complexity. In this paper, an attempt has been made to compare and analyse the performance of the modified SSD against other popular algorithms such as SPA, SSPA and LogSPA. The algorithm is implemented on a Virtex-5 xc5vlx110t FPGA kit to observe the real-time implications and draw apt conclusions. From the FPGA results, this paper concludes that the performance of the modified SSD is similar to that of LogSPA, with improved throughput speed and improved bit error rate (BER).
    Keywords: Simplified Soft Distance (SSD); Field Programmable Gate Array (FPGA); Bit Error Rate (BER); Sum Product Algorithm (SPA); Simplified Sum Product Algorithm (SSPA); Logarithmic Sum Product Algorithm (LogSPA); Low Density Parity Check Codes (LDPC).

    by Kumaran U, Neelu Khare 
    Abstract: Online Social Networks (OSNs) have become highly popular, and users are more and more lured into revealing their private information. To balance privacy and utility, many privacy-preserving approaches have been proposed, but they do not meet users' personalised requirements well. Most social-network data sources, such as Twitter and Facebook, have unstructured data, and no analytics or processing tools can work directly on this unstructured data. Commonly, users lack data privacy and the access control mechanisms needed to remove the risk of disclosure. Thus, a privacy-preserving paradigm is required that automatically preserves user privacy by finding the sensitive attributes and reducing the risk of sensitive information leakage. In this paper, we present a Privacy Preserved Hadoop Environment (PPHE) that automatically detects sensitive attributes using data mining techniques. This work considers Twitter, which enables users to post messages. The content of the posted tweets is wide-ranging and contains private information such as email addresses, mobile numbers, physical addresses and dates of birth. In this context, the purpose of our work is fourfold. First, we authenticate each Twitter user using an integrated RSA and ElGamal algorithm. Second, we categorise the tweets into private and non-private attributes based on a Type-2 Fuzzy Logic System. Third, we apply a data suppression technique to private tweets. Finally, we share users' content based on their similarity information, where content similarity is evaluated using cosine similarity. We evaluate the system performance in terms of accuracy, precision, recall and F-measure.
    Keywords: Privacy preserving Data Mining; Online Social Networks; Twitter; Data Mining Techniques.
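The data suppression step described above can be illustrated with simple regex-based redaction of the obviously private tokens the abstract mentions (emails, mobile numbers). This is a simplistic stand-in: the paper uses a Type-2 fuzzy classifier to decide what is private, and the patterns and tags below are illustrative choices.

```python
# Regex-based suppression of private tokens in tweet text (a simplistic
# stand-in for the paper's fuzzy private/non-private classification).
import re

PATTERNS = [
    (re.compile(r'[\w.+-]+@[\w-]+\.[\w.]+'), '<EMAIL>'),   # email addresses
    (re.compile(r'\+?\d[\d\s-]{8,}\d'), '<PHONE>'),        # long digit runs
]

def suppress(tweet):
    for pattern, tag in PATTERNS:
        tweet = pattern.sub(tag, tweet)
    return tweet

print(suppress('mail me at jo.doe@example.com or call +91 98765 43210'))
# → 'mail me at <EMAIL> or call <PHONE>'
```

Tweets with no matching tokens pass through unchanged, which is the non-private path in the paper's pipeline.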

  • SIBLAR: Secured Identity-Based Location Aware Routing Protocol for MANETs   Order a copy of this article
    by Suma R, Premasudha B G, Ravi Ram V 
    Abstract: A Mobile Ad-hoc Network (MANET) is a self-organising distributed wireless network without any central infrastructure support, wherein every participating node independently acts as a router. Several routing protocols are available for information dissemination in MANETs, but their efficiency is limited due to security breaches. Providing security schemes for robust information dissemination is of high prominence for the real-time deployment of MANET applications. From the existing literature, it is understood that very few research efforts have been made to ensure security for routing protocols, and there is huge scope for the design and development of secured routing protocols for MANETs. In this paper, we consider the security issues of Location Aided Routing (LAR) and propose a Secured Identity-Based Location Aware Routing (SIBLAR) protocol to achieve system security with an improved key refreshment mechanism. MANET scenarios were created in ns-2, and the efficiency of the proposed SIBLAR protocol was evaluated based on certain performance metrics. In the presence of security attacks, the proposed SIBLAR scheme is found to be efficient when compared to basic LAR.
    Keywords: MANET; Security Attacks; Identity-Based Security; Routing; LAR; Performance analysis.

Special Issue on: ICICT-2018 Ubiquitous Sustainable Systems

  • A close scrutiny of dApps and developing an E-voting dApp using Ethereum Blockchain   Order a copy of this article
    by Banupriya N, Pooja Guru, Nevetha S, Roopini J, Nivedhitha M 
    Abstract: In the expeditiously advancing technological world, with the advent of the Internet of Things and Big Data, new people and new devices that store highly sensitive personal data are connected every day. For example, your Google Home is listening to you and collecting data, Facebook knows a great deal about you, and Amazon's Alexa learns your everyday wants. This implies that our data is used to spy on us and advertise to us, and hence these Siren Servers [1] make their profit off of us. Initially, the ideology behind Web 1.0, the "read-only" web, was simply to allow us to read content and search for information; it offered only a jot of interaction. Web 2.0 emerged as a "read-write" web that grants users the ability to contribute content; in fact, applications such as YouTube and MySpace run on users' contributions. But progressively, technology giants such as Google, Amazon, Facebook, Microsoft, Yahoo, Pandora, Spotify, Walmart and Baidu have taken over the internet. These companies are called Siren Servers, and thus there is no room for competition in Web 2.0. This induced the emergence of Web 3.0, also called the "read-write-execute" web. While Blockchain is customarily associated with Bitcoin and transactions, it can also be used to provide various other services. dApps [2] are one compelling solution to the problems of Web 2.0. dApps crucially use Blockchain as their root: when the blocks store code instead of transactions, a dApp comes alive. This can be exploited in many sectors for innumerable reasons, such as data permanence, ownership and accountability. Since Blockchain [3] is the foundation of dApps, dApps also possess properties like security, privacy, immutability, verifiability and auditability, making them highly trustworthy.
The forthright key to bringing individuality out, terminating the need for a middleman in the services we obtain, owning our data, having transparency and resisting censorship, yet remaining profitable, is to make the system decentralized. This paper throws light on developing a dApp (decentralized application) that can be used in a voting system. Everything around us is being digitized, which does not mean we are moving towards a safe and secured scheme; thus the revolutionary Blockchain technology is used to deploy an E-voting system [4], and this is done using the Ethereum [5] Blockchain, which uses Solidity [6] as its programming language.
    Keywords: Blockchain; Decentralized Application; Web 3.0; Ethereum; Smart Contracts; Solidity.

  • Performance Analysis for user Identification in CR Networks by various Modulation Transmission Techniques   Order a copy of this article
    by Anil Kumar Budati, MOHAMMED Saleem Pasha 
    Abstract: The rapid development of newly invented wireless devices and their applications leads to spectrum scarcity. Cognitive Radio (CR) is a technology that offers a solution to the spectrum-scarcity problem through dynamic spectrum access. A user's presence or absence is identified by a spectrum sensing technique in CR networks. Various methods, such as Energy Detection (ED) and Matched Filter Detection (MFD), are used to identify the presence or absence of a user in the spectrum. The performance of user identification is estimated through the Probability of Detection (PD) and the Probability of False Alarm (Pfa). Earlier work estimated these parameters for spectrum sensing under the Bayesian Detection (BD) criterion using a static threshold. In this paper, the authors estimate the same parameters by applying the Neyman-Pearson (NP) detection criterion to the MFD sensing method with a dynamic threshold. The performance is analysed by comparing the existing BD with the proposed NP under the 8-PSK and 8-QAM modulation transmission techniques, and the better detection criterion is identified.
    Keywords: Cognitive Radio; Spectrum Sensing; Neyman Pearson Approach; Probability of False Alarm; Probability of Detection; PSK; QAM.
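Under the Neyman-Pearson criterion the threshold is chosen to fix the false-alarm probability, and the detection probability follows from the signal energy. A schematic sketch for a matched-filter statistic in white Gaussian noise (the energy, noise variance and Pfa values are illustrative, not taken from the paper):

```python
from math import sqrt
from statistics import NormalDist

def np_matched_filter(energy, noise_var, pfa):
    """Neyman-Pearson threshold and Pd for the matched-filter statistic
    T = sum(y[i]*s[i]): T ~ N(0, var*E) under H0 and T ~ N(E, var*E) under H1."""
    std = sqrt(noise_var * energy)                 # std-dev of T under both hypotheses
    gamma = std * NormalDist().inv_cdf(1 - pfa)    # threshold fixing P(T > gamma | H0) = pfa
    pd = 1 - NormalDist(mu=energy, sigma=std).cdf(gamma)
    return gamma, pd

gamma, pd = np_matched_filter(energy=10.0, noise_var=1.0, pfa=0.01)
print(round(pd, 2))  # for a fixed Pfa, Pd grows with signal energy
```

A dynamic threshold, as proposed in the paper, would recompute gamma as the noise variance estimate changes.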

  • Visible Light Communication for Position control of Robotic Vehicle   Order a copy of this article
    by V. Partha Saradi, P. Kailasapathi 
    Abstract: With increasing demand for fast and secure wireless transmission of data from one place to another, many innovations open up with respect to the use of communication protocols. Of late, spectrum availability has become a very scarce resource, inviting alternative transmission methods to meet the emerging challenges. This paper investigates the properties of a visible light communication (VLC) medium and develops a scheme for controlling the position of a Li-Fi-based robotic vehicle. The fast transmission properties of light, along with its edge over radio-frequency and Wi-Fi communication, augur well for exploiting the potential of the Li-Fi environment. The methodology uses an Arduino microcontroller for the generation and reception of the control signal at the transmitter and receiver ends respectively. The response to varying widths of pulse position modulation (PPM) signals brings out the ability of the scheme to respond to changes in position at varying speeds. The results envisage a new dimension to the scope of robotic vehicles in automated domains in terms of faster and more precise operation.
    Keywords: Visible Light communication; Li-Fi; Arduino controller; Radio Frequency; Speed control of motor.

  • The Impact of Work Integrated Learning Towards Students Learning: Case of ICT Students in South African Universities of Technology   Order a copy of this article
    by Bethel Mutanga Murimo 
    Abstract: The change in the global structure of labour demand has, in recent years, led to a strong shift towards high-skilled workers. This trend has become a contributing factor to increasing unemployment rates in South Africa. Consequently, Work Integrated Learning (WIL) has been introduced into the curriculum of South African universities of technology (UoTs) to bridge the gap between theoretical knowledge and real-life industry experience. To learn the contribution of WIL to improving the quality of graduates, it is highly imperative to investigate its impact within a specific discipline. Therefore, based on Kolb's learning model, this paper investigates the impact of WIL on undergraduate ICT students' learning. A quantitative survey instrument was adapted from existing scales and administered to a sample of 76 ICT undergraduate students who had recently completed WIL. The results showed that over 90% of the students indicated that WIL significantly enhanced their learning.
    Keywords: WIL; experiential learning; workplace; innovative learning; skilled labour.

  • A Novel Automatic System for Logo Based Document Image Retrieval Using Hybrid SVDM-DLNN   Order a copy of this article
    by Raveendra Kudamala, Vinothkanna Rajendran 
    Abstract: Many government and private organizations represent themselves to the public using their own symbols or logos, each unique, so that anyone can easily identify their products or belongings. Providing such logos gives the owner ownership and source documentation. Using these logos for document retrieval on the World Wide Web is a booming research area in the present era, since the use of virtual documentation increases day by day and searching this large volume of data for a single item becomes a problem. In the present research arena, various document image retrieval models are available based on classification and clustering techniques. Here, graphical techniques are used to identify the issues in an automatic logo detection model built on a back-propagation neural network along with a singular value decomposition model (SVDM). The proposed research model concerns a document retrieval system based on a logo matching process, attaining better efficiency and accuracy than earlier detection models.
    Keywords: Logo recognition; detection; segmentation; Document retrieval; Feature extraction; Logo extraction; Feature matching.

  • An Evolutionary Frame Work on ADHD Diagnosis Based on Graph Theory and Ant Colony Optimization   Order a copy of this article
    by Catherine Joy R., Thomas George S., Albert Rajan A. 
    Abstract: In developing countries, parents living with children affected by Attention Deficit Hyperactivity Disorder (ADHD) face unavoidable difficulties. This neuropsychiatric disorder affects children in terms of inattention, impulsivity and hyperactivity. Graph theory provides useful descriptive measures as predictor vectors for the classification process, and this research work provides an automated diagnosis model that predicts ADHD features with a neural network classifier, differentiating ADHD patients from their healthy controls in a combined population of normal persons and affected patients. An ant colony optimization model is used to converge the classifier results over both phenotypic and imaging data. The ADHD-200 dataset is used for analysis in the proposed model. The experimental results yield an accuracy of 86% on two-class diagnosis, better than phenotypic approaches.
    Keywords: attention deficit hyperactivity disorder (ADHD); artificial neural network; ant colony optimization.

Special Issue on: ICIS 2016 Computer Aided Inventive Computational Techniques

    by P. Pitchandi, M. Rajendran 
    Abstract: With the abundance of heat, especially in tropical countries and Middle East nations, power-harnessing techniques have increasingly focused on the construction of solar panels and ponds for trapping and harvesting energy. This research paper focuses on design considerations for high-efficiency harvesting of solar heat using solar ponds with thermocouples. The design considerations discussed and implemented in this paper include the analysis of harvested power for varying pond structures such as shallow, medium and deep water ponds. A layout of the experimental design is presented. Data collected over a period of 30 days are used to determine the percentage efficiency of the proposed construction. The final part of this paper proposes an optimised flash evaporator for electric power generation, simulated and analysed for different working fluids, which can be seen as future scope for large-scale power generation.
    Keywords: Solar pond; design considerations; convection zones; flash separator.

  • A Novel Approach for Feature Fatigue Analysis using HMM stemming and Adaptive Invasive Weed Optimization with Hybrid Firework Optimization Method   Order a copy of this article
    by Midhun Chakkaravarthy 
    Abstract: The rapid growth of customer product reviews on e-commerce websites leads new online customers to analyse reviews to learn about the features of the product they want to buy. Integrating many features into a single product makes it more attractive and induces customers to buy it; however, after working with a high-feature product, the customer may become dissatisfied, which eventually reduces the manufacturer's Customer Equity (CE). Thus, it is necessary to analyse the usability of the product. Existing usability evaluation methods have limitations in determining which features must be integrated into the product and which unnecessary features should be removed. In this paper, a novel approach is proposed to help designers find an optimal feature set, providing decision support for product designers to enhance future product usability. The most recent customer reviews on product usability are collected from the web. Latent Dirichlet Allocation is used to extract product features through a stemming process integrated with a Hidden Markov Model. The k-Optimal Rule Discovery technique with an Adaptive Invasive Weed Optimization algorithm is adopted to obtain the optimal customer opinions on the usability of product features. Finally, a hybrid Firework Optimization method with differential evolution is adopted for feature fatigue analysis based on usability, and feature fatigue is alleviated efficiently. The proposed approach is evaluated experimentally, and the results show that it achieves 97% accuracy, which is higher than existing work.
    Keywords: Feature Fatigue; Latent Dirichlet Allocation; Hybrid Firework Optimization; Differential Evolution.
    DOI: 10.1504/IJCAET.2019.10009148
  • Investigation of Methodical Framework for Cross-Platform Mobile Application Development: Significance of Codename One   Order a copy of this article
    by Munir Kolapo Yahya-Imam, Sellappan Palaniappan, Seyed Mohammadreza Ghadiri 
    Abstract: The mobile application development landscape is changing very rapidly, with developers moving from the traditional approach to "write once, run anywhere". In any case, most mobile application projects face comparable constraints, such as tight schedules, limited budgets, and the need to support both Android and iOS. For most developers, particularly those migrating from web to mobile applications, cross-platform mobile development tools are often preferred as new tools that promise some native-like functionality and performance. However, developers often ask questions like "which cross-platform development tool should we choose?" and "which framework is easier, better and supports our requirements?". This paper presents answers to these questions by evaluating some popular cross-platform mobile application development tools. Towards the end, this paper recommends Codename One for cross-platform mobile application developers because of its unique strengths and significance.
    Keywords: Cross-Platform; Codename One; PhoneGap; Xamarin; SenchaTouch; Titanium; Mobile App Development.

  • Performance Comparison of SDN OpenFlow Controllers   Order a copy of this article
    by Vishnupriya Achuthan, Radhika N 
    Abstract: Software Defined Networking (SDN) is a centralised network management technology that can reduce the network administration and policy-enforcement overhead of traditional IP networking. The SDN controller is the network operating system responsible for all network operations. Many open-source controllers are available, such as NOX, POX, FloodLight and OpenDaylight, each with its own properties suited to specific requirements. In this paper, we compare the performance of the most familiar OpenFlow controllers (NOX, POX, Ryu, FloodLight and the OpenFlow reference controller) based on their packet-handling capacity, by varying the packet size, the number of packets and the arrival pattern of the IP traffic flows. The Distributed Internet Traffic Generator (D-ITG) tool has been used to measure performance in terms of delay, jitter, throughput and packet loss. Our experimental results show that FloodLight has better throughput and lower delay than the other controllers. This work helps researchers choose the appropriate controller for their requirements.
    Keywords: Software Defined Networking; SDN Controllers; Traffic Generation; QoS parameters.

  • OntoCommerce: An Ontology Focused Semantic Framework for Personalized Product Recommendation for User Targeted E-Commerce   Order a copy of this article
    by Gerard Deepak, Dheera Kasaraneni 
    Abstract: In recent times, with the increase in the number of users of the Internet and the World Wide Web, there has been a paradigm shift in business strategy in terms of online marketing and e-commerce. Several e-commerce websites serve as a perfect platform for connecting products and users with the goal of selling the product. Although many e-commerce websites are available, the recommendation of relevant products to users can always be improved. With the World Wide Web transforming into a more intelligent Semantic Web, there is a perpetual need for a semantically driven e-commerce system that recommends products according to user preferences. In this paper, OntoCommerce, an e-commerce system that incorporates semantically driven algorithms for personalized product recommendation, is proposed. A new variant of Normalized Pointwise Mutual Information, called the Enriched Normalized Pointwise Mutual Information strategy, is also proposed for semantic similarity computation. OntoCommerce incorporates ontologies and recommends products based on the user query, recorded user navigation and user profile analysis. To make the recommendations more relevant and lower the false discovery rate, OntoCommerce fuzzifies certain parameters to enhance the number of recommendable products. OntoCommerce yields an average accuracy of 88.68% with a very low false discovery rate of 0.13, making it a best-in-class semantically driven product recommendation system.
    Keywords: E-Commerce; Ontologies; Personalized Recommendation System; Semantic Similarity; Enriched Normalized Pointwise Mutual Information.
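The standard Normalized Pointwise Mutual Information on which the proposed "Enriched" variant builds can be sketched as follows (the enrichment step itself is not detailed in the abstract, so only the base measure is shown):

```python
import math

def npmi(p_xy, p_x, p_y):
    """Normalized PMI: pmi(x, y) / -log p(x, y), bounded in [-1, 1];
    1 means the terms always co-occur, 0 means independence."""
    pmi = math.log(p_xy / (p_x * p_y))
    return pmi / -math.log(p_xy)

# Terms that always co-occur reach the maximum of 1.
print(round(npmi(p_xy=0.1, p_x=0.1, p_y=0.1), 6))  # 1.0
```

The probabilities would come from co-occurrence counts over the corpus of product descriptions and user navigations.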

  • Road Segmentation and Tracking on Indian Road Scenes   Order a copy of this article
    by VIPUL MISTRY, Ramji Makwana 
    Abstract: Vision-based road detection is a challenging task due to surrounding scenes and types of roads. This paper describes an efficient and effective algorithm for general road segmentation and tracking. The major contributions of this paper are three: 1) an optimised voter-selection strategy with a modified voting process for vanishing point detection; 2) use of a Kalman filter to avoid false detection of the vanishing point and to reduce the computational complexity of the final road segmentation; and 3) evaluation of the algorithm on different road types with varying surrounding scenes. The method has been implemented and tested on 10,000 video frames of Indian road scenes. Experimental results demonstrate that the algorithm achieves better efficiency than some texture-based vanishing point detection algorithms and successfully segments drivable road regions from varying Indian road scenes.
    Keywords: Vanishing Point; Kalman Filter; Road segmentation; Drivable road detection.
    DOI: 10.1504/IJCAET.2019.10010547
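The use of a Kalman filter to stabilise the detected vanishing point can be illustrated with a minimal random-walk filter applied independently to each image coordinate (the process and measurement noise values here are illustrative assumptions, not the paper's tuning):

```python
class ScalarKalman:
    """1-D random-walk Kalman filter for one coordinate of the vanishing point."""
    def __init__(self, x0, p0=1.0, q=0.01, r=4.0):
        # state estimate, its covariance, process noise, measurement noise
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def update(self, z):
        self.p += self.q                 # predict: state unchanged, uncertainty grows
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # correct with the new per-frame detection z
        self.p *= (1.0 - k)
        return self.x

# Noisy per-frame detections of the x-coordinate are smoothed toward the true value;
# an outlier detection moves the estimate only fractionally, suppressing false detections.
kf = ScalarKalman(x0=320.0)
for z in [322.0, 317.0, 324.0, 319.0, 321.0]:
    est = kf.update(z)
print(round(est, 1))
```

Running one such filter per coordinate gives a smoothed vanishing-point track at negligible cost, which is the computational saving the abstract alludes to.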
  • Advanced Prediction of Learner's Profile based on Felder Silverman Learning Styles using Web Usage Mining approach and Fuzzy C-Means Algorithm   Order a copy of this article
    by Youssouf EL ALLIOUI 
    Abstract: Problem Statement: One of the biggest problems in e-learning is how to predict the learner's profile in order to personalize the e-learning process well. The lack of information about the learner makes it very complicated for a learning environment to provide information and to identify the starting difficulty of the content, which decreases the efficiency of the e-learning process. Research Questions: Which model for identifying the starting difficulty of the content will be accurate and general enough to give a stable prediction for different learners? Purpose of the Study: To develop an automatic, optimal and universal prediction model that identifies the starting difficulty of the content of the e-learning process, and to test the accuracy of the model for different levels of learners. Research Methods: The learning behaviour is captured using the Web Usage Mining (WUM) technique and converted into a standard learning style model; the work is mainly focused on the identification of learning styles. The captured data is preprocessed and converted into XML format based on sequences of content accesses on the portal. These sequences are mapped to the eight categories of the Felder Silverman Learning Style Model (FSLSM) using the Fuzzy C-Means (FCM) algorithm. A Gravitational Search Based Back Propagation Neural Network (GSBPNN) algorithm is used to predict the learning styles of a new learner; it modifies the neural network approach by calculating the weights using the Gravitational Search Algorithm (GSA). Findings: The accuracy of the prediction model is compared with the basic Back Propagation Neural Network (BPNN) algorithm.
    Keywords: Learner's Profile; Felder Silverman Learning Styles; Web Usage Mining; Fuzzy C-Means Algorithm.
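The Fuzzy C-Means step that maps behaviour features to FSLSM categories can be sketched on toy 1-D data (the paper clusters access sequences into eight categories; the two-cluster data and initialisation below are illustrative assumptions):

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=50):
    """Standard FCM on 1-D data: returns cluster centres and the membership matrix."""
    srt = sorted(data)
    # deterministic initialisation: centres spread evenly over the sorted data
    centres = [srt[round(i * (len(srt) - 1) / (c - 1))] for i in range(c)]
    for _ in range(iters):
        # membership update: u_i = 1 / sum_k (d_i / d_k)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - v) or 1e-9 for v in centres]
            u.append([1.0 / sum((d[i] / d[k]) ** (2 / (m - 1)) for k in range(c))
                      for i in range(c)])
        # centre update: mean of the data weighted by u^m
        centres = [sum((u[j][i] ** m) * data[j] for j in range(len(data))) /
                   sum(u[j][i] ** m for j in range(len(data))) for i in range(c)]
    return centres, u

centres, u = fuzzy_c_means([1.0, 1.2, 0.8, 8.0, 8.3, 7.9])
print(sorted(round(v, 1) for v in centres))  # one centre near each group
```

Each learner then belongs to every category with a degree of membership rather than a hard label, which is what makes FCM a natural fit for overlapping learning styles.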

    by Sivakumaran AR, Marikkannu P 
    Abstract: Web Mining (WM) is the automatic discovery of user access patterns from web servers. Universities collect vast volumes of data in their day-to-day operations, generated automatically by web servers and gathered in server access logs. This research presents Forecasting and Enhancing University Navigation from Web Log Data (F&EUN-WLD). In the primary stage, F&EUN-WLD concentrates on isolating the potential users in the web log data (WLD). Experimental results show that the methodology can enhance the quality of clustering of user navigation patterns in web usage mining systems. These outcomes can be used to predict a user's next request on large web sites.
    Keywords: Web Mining; Web Log Data; route design; Navigation; web utilization.

  • A safety system for school children using GRAG   Order a copy of this article
    by Joe Louis Paul Ignatius, Sasirekha Selvakumar 
    Abstract: Millions of children commute between home and school every day. Safe transportation of school children is a critical issue, as children may be prone to abductions, accidents, etc., and parents are genuinely worried about their children's safety. Hence, many systems using Radio Frequency Identification (RFID) and the Global Positioning System (GPS) have been built. This work provides yet another solution to these problems by integrating both technologies with the Global System for Mobile Communication (GSM) into an efficient system called GRAG (GPS, RFID And GSM). RFID monitors the entry and exit of children into and out of the school, and messages about these events are sent to parents using the Short Message Service (SMS). In case the child does not reach the school or home within the expected time, parents can call the registered number of the GPS tracker to activate it. The tracker then determines the child's position, and the latitude and longitude coordinates are sent to the parent's mobile. These coordinates can be entered into the Android application to find the exact location.
    Keywords: Radio Frequency Identification (RFID); Global System for Mobile Communication (GSM); GPS tracker; Arduino UNO; Parallax Data Acquisition (PLX-DAQ); Android application.

  • An Android based Hardware System for Accident Avoidance and Detection on Sharp Turns   Order a copy of this article
    by Shilpa Mahajan, Nikhat Ikram 
    Abstract: Accidents are a major cause of destruction of human lives: they are uninvited, unintentional happenings that cause damage and injury and can cause loss of life. One of the reasons for the increase in road accidents is the growing prosperity of the world, which has increased the number of vehicles on the roads; this in turn increases traffic density, travelling distance and time spent travelling, and hence the chances of vehicle collision. Given this scenario of increasing road accidents, a working hardware prototype, together with an Android-based application, is proposed to avoid road accidents. This paper presents a scheme in which collisions are avoided by alerting the driver with a buzzer whenever another ZigBee-equipped car is in range of the current car, while a message flashes on an LCD inside the car indicating that a vehicle has been detected nearby. Communication between the cars takes place via ZigBee. The positions of approaching vehicles can be seen on an Android application installed on a mobile device, which depicts the exact locations of the vehicles on a Google map. This helps to reduce collisions. The proposed method differs from others in that the techniques invented to date are mostly built-in, while the method proposed here is a prototype that can be implemented in the real world.
    Keywords: WSN;Accidents;VANET.

  • Computer Aided Software Integrated Automated Safety System   Order a copy of this article
    by SOWJANYA Pentakota 
    Abstract: Software for any system must deal with the hazards identified by safety analysis in order to make the system safe. Building safety software requires special procedures in all phases of the software development process. In this work, we apply safety-analysis techniques such as Failure Modes and Effects Analysis (FMEA) and Fault Tree Analysis (FTA) in a safety-critical approach to the development of an integrated automotive system. A Software Safety Architecture and a Software Safety Lifecycle are developed here using several important safety techniques, and a new software development lifecycle with an integration approach, the Agile-V model, is proposed. A driver assistance system such as the ACCS is a vehicle automotive system that helps prevent accidents by reducing the workload on the driver. The basic design and functionality of the ACCS include a safety command for bypassing to the braking system when needed. As a safety approach to some limitations, we introduce an integrated architecture using fuzzy logic, which has fewer failure cases and improves efficiency. The basic design and functionality of the braking system are evaluated with and without ABS, so that the stopping distance also decreases.
    Keywords: Adaptive Cruise Control System (ACCS); Anti-Lock Braking System (ABS); FMEA; FTA; Software Safety Architecture (SSA); Software Safety Lifecycle (SSL).

Special Issue on: ICMCE-2015 Advances in Applied Mathematics

    by Subramani Rajamanickam, Vijayalakshmi C 
    Abstract: This paper mainly deals with the development of an energy management model using a SCADA (Supervisory Control and Data Acquisition) system. A predictive controller is implemented on top of the centralised SCADA platform. The focus in the distribution networks is on monitoring, controlling and maintaining equipment in the substations to reduce operating cost. This research proposes a new energy management model that enables flexible and efficient operation of various power plants. The Distribution Control Centre (DCC) is monitored and controlled by SCADA systems, and the DCC has become an important energy-efficiency policy concept. Numerical calculations and graphical representations show that, in both configurations, the renewable energy sources are independent of the enduring or intermittent availability of the main energy resource, which can lead to effective production.
    Keywords: Distribution Automation; DCC; SCADA; Transmission Capacity; Demand-Side Management; Lagrangian Relaxation (LR).

    by Karunamurthy Krishnasamy, Chandrasekar M, Manimaran R 
    Abstract: In this paper, an artificial neural network (ANN) model is used to predict the performance parameters of a laboratory-model salinity gradient solar pond (SGSP) used for supplying hot water. Experiments were conducted on three different solar ponds, with and without twisted tapes in the flow passage of the in-pond heat exchanger, during May 2015 under Chennai weather conditions in India. The performance parameters of the solar pond, namely the outlet water temperature, the efficiency of the solar pond and the effectiveness of the in-pond heat exchanger, were determined experimentally for two flow rates corresponding to Reynolds numbers of 1746 and 8729. The experimental data were used for training, validating and testing the proposed artificial neural network model. Parameters such as incident solar radiation, inlet water temperature, lower convective zone (LCZ) temperature and flow rate determine the outlet water temperature of the solar pond. A computational program taking the experimental readings as inputs was developed in Python and trained as an artificial neural network with the back-propagation algorithm to predict the outlet water temperature of the in-pond heat exchanger. The results predicted by the developed model are in good agreement with the experimental results.
    Keywords: Solar Pond; Performance Parameters; Artificial Neural Network; Twisted Tapes.
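The kind of back-propagation regressor the abstract describes can be sketched in a few lines; the toy target below is an illustrative stand-in for the real mapping from radiation, inlet/LCZ temperatures and flow rate to outlet temperature:

```python
import math, random

def train_mlp(samples, hidden=4, lr=0.1, epochs=500, seed=1):
    """One-hidden-layer tanh network trained with plain backpropagation.
    No bias terms, which suffices for a target passing through the origin."""
    rng = random.Random(seed)
    n_in = len(samples[0][0])
    w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(hidden)]
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    for _ in range(epochs):
        for x, y in samples:
            h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w1]
            out = sum(w * hi for w, hi in zip(w2, h))
            err = out - y                               # dLoss/dout for 0.5*err^2
            for i in range(hidden):
                grad_h = err * w2[i] * (1 - h[i] ** 2)  # back through tanh
                w2[i] -= lr * err * h[i]
                for j in range(n_in):
                    w1[i][j] -= lr * grad_h * x[j]
    return w1, w2

def predict(w1, w2, x):
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w1]
    return sum(w * hi for w, hi in zip(w2, h))

# Toy stand-in data: "outlet temperature" as a simple function of two scaled inputs.
data = [([a / 10, b / 10], 0.05 * (a + b)) for a in range(10) for b in range(10)]
w1, w2 = train_mlp(data)
```

With the real measurements, the same loop would be run over the training split and validated against the held-out readings.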

  • Logistic Regression Model as Classifier for Early Detection of Gestational Diabetes Mellitus   Order a copy of this article
    Abstract: Gestational Diabetes Mellitus (GDM) is any degree of glucose intolerance during pregnancy. In view of maternal morbidity and mortality as well as fetal complications, early diagnosis is of utmost necessity in the present scenario. In a developing country like India, early detection and prevention are more cost-effective. The Oral Glucose Tolerance Test (OGTT) is the crucial method for diagnosing GDM, usually performed between the 24th and 28th weeks of pregnancy. The proposed work focuses on early detection of GDM, without a hospital visit, for women who are pregnant for the second time onwards (multigravida patients). In recent years, prediction models using multivariate logistic regression analysis have been developed in many areas of healthcare research. With an accuracy of 82.45%, the classifier proves to be an efficient model for the diagnosis of GDM without the conventional blood test, taking newly designed parameters as inputs to the model.
    Keywords: Gestational Diabetes Mellitus; Diagnosis; Logistic Regression; Risk Factors.
    DOI: 10.1504/IJCAET.2019.10011940
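A logistic-regression classifier of the kind described can be sketched with plain gradient descent; the two features and toy data below are hypothetical stand-ins for the paper's newly designed risk-factor inputs:

```python
import math

def train_logistic(xs, ys, lr=0.5, epochs=2000):
    """Logistic regression fitted by gradient descent on the negative log-likelihood."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            g = p - y                      # gradient of the loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    """Class 1 when the fitted logit is positive (probability > 0.5)."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Hypothetical two-feature risk data (e.g. scaled age and BMI); class 1 = high risk.
xs = [[0.1, 0.2], [0.2, 0.1], [0.3, 0.3], [0.8, 0.9], [0.9, 0.7], [0.7, 0.8]]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(xs, ys)
print([predict(w, b, x) for x in xs])  # matches ys on this separable toy set
```

The reported 82.45% accuracy would come from evaluating such a fitted model on real multigravida patient records rather than this toy set.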
  • Information Hiding using LSB Replacement Technique and Adaptive Image Fusion   Order a copy of this article
    by Lakshmi Priya S, Namitha K, Neela Niranjani V, Manoj Kumar Natha 
    Abstract: Steganography is a branch of information hiding which allows people to communicate in a secure way. As more information is transferred electronically, the need for confidentiality of this information increases. Our paper combines two techniques: (1) the LSB replacement technique for hiding text messages in an image, and (2) an iterative image fusion algorithm. This algorithm uses a calculated fusion parameter to select and fuse an optimal carrier image with the input image containing the hidden text. Both images belong to the same class, and this classification is based on the values of three coefficients (brightness, texture and variation) obtained from the Haar wavelet transform. This additional step of fusing the images increases the PSNR value of the final image, which in turn enhances security during transmission. The results show how the PSNR value increases for the final image, and a comparative analysis is performed in three ways: (1) PSNR values before and after fusion, (2) effects of using an optimal carrier versus a random carrier image, and (3) effects of varying the length of the input text message on the PSNR value. From these three cases, we show that the PSNR value of the final image is increased after fusion with an optimal carrier image.
    Keywords: Steganography; Adaptive Image Fusion; LSB Replacement.
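The LSB replacement step can be sketched on a flat list of 8-bit pixel values; a real implementation would also embed the message length or a terminator, whereas here the byte count is passed explicitly for brevity:

```python
def embed_lsb(pixels, message):
    """Hide the message's bits (MSB-first) in the least significant bits of pixels."""
    bits = [(byte >> i) & 1 for byte in message.encode() for i in range(7, -1, -1)]
    assert len(bits) <= len(pixels), "cover image too small"
    # clearing the LSB and OR-ing in the bit changes each pixel by at most 1,
    # which is why the stego image keeps a high PSNR
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]

def extract_lsb(pixels, n_bytes):
    """Recover n_bytes hidden by embed_lsb."""
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for b in pixels[8 * i: 8 * i + 8]:
            byte = (byte << 1) | (b & 1)
        out.append(byte)
    return out.decode()

cover = list(range(200, 256)) * 3   # stand-in for flattened grayscale image pixels
stego = embed_lsb(cover, "hi")
print(extract_lsb(stego, 2))        # -> hi
```

The paper's fusion stage then operates on the stego image; the embedding itself is this simple bit substitution.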

Special Issue on: Ubiquitous Sustainable Systems

  • Gender Classification using PSO based Feature Selection and Optimized BPNN in Forensic Anthropology   Order a copy of this article
    by Nurul Liyana Hairuddin, Lizawati Mi Yusuf, Mohd Shahizan Othman, Dewi Nasien 
    Abstract: The development of a biological profile allows gender classification, which is a crucial task in most forensic cases. A biological profile is developed by anthropologists to assist in the identification of an individual, and in most forensic anthropology cases skeletal remains are employed to produce it. Different parts of the human skeleton are available for gender classification, and every part contains different types of features that can benefit the task. However, not all features contribute to gender classification, because some carry no information on gender and some data may contain redundant information. Hence, this article proposes Particle Swarm Optimization (PSO) based feature selection and an optimized BPNN model as a gender classification framework. PSO first selects the most significant features, leading to an accurate classification process, while in the BPNN stage parameter tuning based on cross-validation finds a good combination of learning rate and momentum. The main scope of this article is to develop a framework able to conduct proper feature selection and parameter optimization for accurate gender classification in the forensic anthropology field. This article uses three different datasets: the Goldman Osteometric dataset, the Clavicle collection, and the George Murray Black collection. The results show that the proposed framework improves the accuracy of gender classification for every dataset.
    Keywords: Gender classification; forensic anthropology; feature selection; particle swarm optimization; backpropagation neural network; parameter tuning.
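The PSO-based selection stage can be illustrated with a small binary PSO that scores 0/1 feature masks; the fitness below is a toy stand-in for the BPNN classification accuracy the paper would use:

```python
import math, random

def binary_pso(n_feats, fitness, n_particles=10, iters=40, seed=3):
    """Binary PSO: each particle is a 0/1 feature mask; bits are resampled
    through a sigmoid of the velocity (the discrete Kennedy-Eberhart variant)."""
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_feats)] for _ in range(n_particles)]
    vel = [[0.0] * n_feats for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pscore = [fitness(p) for p in pos]
    gscore = max(pscore)
    gbest = pbest[pscore.index(gscore)][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_feats):
                vel[i][d] += (2 * rng.random() * (pbest[i][d] - pos[i][d])
                              + 2 * rng.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-4.0, min(4.0, vel[i][d]))  # keep the sigmoid responsive
                pos[i][d] = 1 if rng.random() < 1 / (1 + math.exp(-vel[i][d])) else 0
            s = fitness(pos[i])
            if s > pscore[i]:
                pscore[i], pbest[i] = s, pos[i][:]
                if s > gscore:
                    gscore, gbest = s, pos[i][:]
    return gbest

# Toy fitness: features 0 and 2 are informative, every selected feature costs 1.
def mask_fitness(mask):
    return 2 * mask[0] + 2 * mask[2] - sum(mask)

best = binary_pso(n_feats=6, fitness=mask_fitness)
print(best)  # optimum under this fitness is [1, 0, 1, 0, 0, 0]
```

In the proposed framework the fitness would instead be the cross-validated accuracy of the BPNN trained on the selected osteometric measurements.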