International Journal of Computational Systems Engineering (31 papers in press)
Need for RADAR System Utilization for Maritime Traffic Management: A case of Congo River Basin
by Habib Ullah Khan, Oduniyi Ayotunde Adediji
Abstract: Maritime traffic management has emerged as a new challenge alongside developments taking place around the world: maintaining port productivity while also maintaining safety and security has always been a demanding task. The present study concentrates on such measures for maritime traffic management on the Congo River Basin with the help of RADAR technology. It aims to gauge the intensity of the problem and the necessity for such a system, in place of old procedures, to cope with growing mishaps and traffic. The study collected the opinions of maritime department personnel as primary data, while secondary data were collected from the records of the CICOS and SCTP departments. The data were analysed using the analysis of variance (ANOVA) technique to determine whether untoward incidents and traffic have increased among the three countries connected to the basin. The results showed the role of human error in the occurrence of mishaps, and the personnel also favoured installing a RADAR system to control them. The ANOVA results, tested against the p-value at the 5% level of significance, showed a significant increase in accidents and deaths on the Congo River Basin from 2008 to 2012. Passenger and goods traffic data for 2010 to 2012 showed a similar trend, highlighting the necessity for efficient measures through strict rules and new RADAR-based systems.
Keywords: Democratic Republic of Congo (DRC); Central Africa Republic (CAR); Congo Republic (RC); International Maritime Organization (IMO); Commission Internationale du Bassin Congo-Oubangui-Sangha (CICOS); Societe Commerciale des Transports et des ports (SCTP).
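To make the ANOVA step above concrete, here is a minimal one-way ANOVA sketch at the 5% level; the yearly accident counts are hypothetical figures invented for illustration, not the CICOS/SCTP records.

```python
# Minimal one-way ANOVA sketch (hypothetical data, not the CICOS/SCTP records).
from scipy import stats

# Hypothetical monthly accident counts for three years on the basin.
accidents_2008 = [4, 3, 5, 2, 4, 3, 5, 4, 3, 4, 5, 3]
accidents_2010 = [6, 5, 7, 6, 8, 5, 7, 6, 6, 7, 8, 6]
accidents_2012 = [9, 8, 10, 9, 11, 8, 10, 9, 9, 10, 11, 9]

f_stat, p_value = stats.f_oneway(accidents_2008, accidents_2010, accidents_2012)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# Reject the null hypothesis of equal means at the 5% level when p < 0.05,
# i.e. conclude the yearly accident levels differ significantly.
if p_value < 0.05:
    print("Significant difference across years at the 5% level")
```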
Finding the Best Bug Fixing Rate and Bug Fixing Time Using Software Reliability Modelling
by Rama Rao
Abstract: This article focuses on finding the best possible way to improve the Bug Fixing Rate (BFR) and Bug Fixing Time (BFT). Several software projects have been examined while characterising the bug fixing rate. To increase the bug fixing rate, bug traceability time is reduced by placing a version tag in each and every component of a software deliverable. Software build release time is optimised using mathematical optimisation techniques such as software reliability growth and non-homogeneous Poisson process (NHPP) models, which is essential in the present market scenario. Build inconsistency and automation issues are also rectified in this research work. The developed software has fewer defects, and software quality improves as the bug fixing rate increases.
Keywords: Bug Fixing Rate; Bug Fixing Time; Bug Traceability Time; Software Build Automation; Software Reliability; Version Tag; Software Risk and Version Control System.
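The abstract names software reliability growth and NHPP models without giving a formula; a common instance is the Goel-Okumoto NHPP model, whose mean value function is m(t) = a(1 - e^(-bt)). The sketch below, with illustrative parameter values rather than anything taken from the paper, estimates how many bugs are expected fixed by time t and a release time at which a target fraction of bugs is expected to be fixed.

```python
# Goel-Okumoto NHPP sketch: m(t) = a * (1 - exp(-b * t)),
# where a = expected total bugs and b = per-bug detection/fix rate.
# Parameter values here are illustrative, not taken from the paper.
import math

a, b = 150.0, 0.05   # total expected bugs, fix rate per unit time

def expected_bugs_fixed(t):
    """Expected cumulative number of bugs fixed by time t."""
    return a * (1.0 - math.exp(-b * t))

def release_time(target_fraction):
    """Time when the target fraction of bugs is expected fixed:
    solve a*(1 - e^{-bt}) = fraction*a  =>  t = -ln(1 - fraction)/b."""
    return -math.log(1.0 - target_fraction) / b

t95 = release_time(0.95)
print(f"95% of bugs fixed by t = {t95:.1f} "
      f"(expected {expected_bugs_fixed(t95):.0f} of {a:.0f})")
```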
Evolutionary Optimisation to Minimise Material Waste in Construction
by Andy Connor, Wilson Siringoringo
Abstract: This paper describes the development and evaluation of a range of metaheuristic search algorithms applied to the optimal design of two-dimensional layout problems, with particular application to residential building construction. Results are presented that allow the performance of the different algorithms to be compared in the Pareto-optimal solution space, with the resulting solutions identified and analysed in the objective space. These results show that all of the algorithms investigated have the potential to be applied to optimise material layout and improve the design processes used during building construction.
Keywords: Metaheuristic algorithms; Evolutionary computation; Layout optimisation; Residential construction.
Study and Analytical Perspective on Big Data
by Yashika Goyal, Yuvraj Monga, Mohit Mittal
Abstract: Over the past era, great advancement has been witnessed in the technological world, which is expanding exponentially in the field of virtualisation. As a result, every field is undergoing digitalisation: wired as well as wireless communication now operates entirely in digital form. Consequently, an enormous expansion in the amount of data has been observed. To manage these large chunks of information, scientists and researchers have focused on Big Data, which has the capability to bring a paradigm shift to prevalent IT services. In this paper, we focus on the terminology of big data, its applications and the various tools used to manage it.
Keywords: Data; Big Data Analytics; Data Mining; Big Data applications; Big Data tool; Future trends.
A Survey of Machine Learning Techniques
by Devi I, Karpagam G R, Vinothkumar B
Abstract: Artificial Intelligence (AI) allows systems to observe their environments and perform certain functions, and aims to maximise the probability of success in solving real-world problems. With technological enhancements and scientific growth, AI has turned into a field of great interest, leading to an amplified focus on Machine Learning (ML) techniques. ML is among the most important data analysis methods, iteratively learning from the available data by using learning algorithms. The present survey provides the theoretical representation and basic methodologies of machine learning techniques such as Support Vector Machines (SVM), K-Nearest Neighbours (KNN), decision trees, Bayesian networks, clustering, Hidden Markov Models (HMM) and neural networks. The survey also discusses the influence of machine learning techniques such as clustering, SVM and ANN on image compression, and draws attention to the existing scope for image compression with machine learning.
Keywords: Machine Learning; supervised learning; unsupervised learning; support vector machine; artificial neural networks and Image Compression.
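As a compact companion to the techniques surveyed, the sketch below trains three of them (SVM, KNN and a decision tree) on a toy scikit-learn dataset; it is purely illustrative and is not tied to the paper's image compression discussion.

```python
# Side-by-side look at three surveyed classifiers on a toy dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "SVM": SVC(kernel="rbf"),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "Decision tree": DecisionTreeClassifier(max_depth=3),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```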
Financial Crisis and Technical Progress 2008-2016. The Parallel Shift in Information Technology
by Marius Balas
Abstract: The paper discusses the paradoxical relationship between the 2008 global financial crisis, which triggered a widespread economic recession, and the rate of technical progress, which blossomed in the following years. We explain this phenomenon by the significant decrease in financial leverage caused by the crisis. The automotive industry, with the launch of the fully electric car, illustrates the thesis. In IT, the crisis triggered a shift towards parallel computing. Outstanding achievements in avionics, biomedical/health care equipment and dataflow computing reveal a strategic superiority of parallel systems (FPGA/ASIC) over conventional CPU bus-oriented computing devices (von Neumann architectures: microcontrollers, DSPs, etc.) in most respects: speed, energy consumption, size, weight and reliability. Still, the parallel shift implies new paradigms, costs and efforts, and the mainstream of the IT establishment has so far been reluctant to embrace it. Recent events show a new attitude towards, and interest in, parallel architectures.
Keywords: financial crisis; financial leverage; fully electric car; von Neumann architecture; gate array; parallel computing.
A Comparative Evaluation of Microwave Antenna Designs' Performance on Digital MR Images in a Hyperthermia System
by Minu Sethi
Abstract: Various hyperthermia techniques have addressed a wide variety of clinical and technical issues, yet there is still scope for improvement, particularly in the region surrounding a tumor, where healthy tissues are affected by heat. An efficient hyperthermia system is presented to evaluate and compare the performance of a rectangular E-shape microwave antenna with two different circular-shape antennas on a digital MR (magnetic resonance) image of a brain tumor, addressing these limitations. The antennas are designed in the IE3D software for a centre frequency of 2.45 GHz, the microwave frequency most commonly used in hyperthermia systems for standard medical applications. A program developed in MATLAB takes the raw return-loss data from the different antennas designed in IE3D and produces an analog input signal. Thermal imaging is done by applying the EM (electromagnetic) signal to the ROI (region of interest) in the MR image of the brain tumor. Some features of the program are: (1) rectangular binary masking is applied to the MR image of the brain tumor to create the ROI; (2) a thin plate with triangular elements is meshed onto the original tumor image; (3) a PDE (partial differential equation) is solved to define the temperature in the thin plate. The system combines data processing and analysis, and is simple and easy to use for obtaining both quantitative values and images of temperature changes. For real-time analysis, the antenna designs are fabricated and tested. Heating up to 45 °C with minimal damage to surrounding healthy tissues is the prime focus of the research. The evaluated results illustrate the potential of the designed antennas as hyperthermia treatment applicators.
Keywords: microwave signal; EM (electromagnetic); MR (Magnetic Resonance) Image; hyperthermia; tumor; micro strip patch; PDE (Partial differential equation); ROI (region of interest); return loss.
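Step (3) of the program solves a PDE for the temperature in a thin plate. As a rough stand-in for the MATLAB implementation, the following sketch applies explicit finite-difference steps of the 2D heat equation with heating confined to a rectangular ROI mask; the grid size, diffusivity and source strength are invented for illustration.

```python
# Explicit finite-difference sketch of 2D heat diffusion inside a rectangular
# ROI, loosely mirroring steps (1) and (3) of the MATLAB program described
# above. Grid size, source strength and alpha are illustrative only.
import numpy as np

n = 64
T = np.full((n, n), 37.0)            # body temperature everywhere (deg C)
roi = np.zeros((n, n), dtype=bool)   # rectangular binary mask = the ROI
roi[24:40, 24:40] = True

alpha, dt, dx = 0.14, 0.1, 1.0       # diffusivity, time step, grid spacing
source = 2.0                         # heating deposited by the EM signal

for _ in range(500):
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
    T += dt * (alpha * lap + source * roi)   # heat only inside the ROI
    T = np.clip(T, 37.0, 45.0)               # cap at the 45 deg C target

print(f"peak {T.max():.1f} deg C in ROI, {T[~roi].max():.1f} deg C outside")
```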
An Empirical Evaluation of Memoryless and Memory-Using Meta-Heuristics for Solving the Travelling Salesman Problem
by Arun Prakash Agrawal, Arvinder Kaur
Abstract: In many situations, a researcher is bewildered when it comes to selecting an appropriate metaheuristic algorithm for a specific problem. To overcome such confusion, metaheuristic algorithms need to be categorised by their ability to solve problems of varying complexity. With this in view, the performance of six popular metaheuristic algorithms has been evaluated. Three research questions are framed to test the hypothesis of a difference in performance between memoryless and memory-based metaheuristics. The domain of inquiry in this paper is the travelling salesman problem. Extensive experiments were conducted and the results analysed using statistical tests such as the F-test and post-hoc tests. A clear outcome of this study is that there is an interaction effect between problem size and the metaheuristic used, with no clear superiority of one metaheuristic over another.
Keywords: Optimization problems; meta-heuristics; Travelling Salesman Problem.
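To make the memoryless/memory-based distinction concrete, the sketch below contrasts plain 2-opt local search (no memory) with a variant that keeps a small tabu list of recent moves; the tour data and parameters are invented, and this is not one of the six algorithms benchmarked in the paper.

```python
# Tiny TSP illustration of the memoryless vs memory-using distinction:
# plain 2-opt accepts any improving move (no memory), while the tabu
# variant also remembers recent moves and refuses to repeat them soon.
import random
from collections import deque

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(30)]

def length(tour):
    return sum(((cities[tour[i]][0] - cities[tour[i - 1]][0]) ** 2 +
                (cities[tour[i]][1] - cities[tour[i - 1]][1]) ** 2) ** 0.5
               for i in range(len(tour)))

def two_opt(tour, tabu=None, iters=4000):
    tour = tour[:]
    for _ in range(iters):
        i, j = sorted(random.sample(range(len(tour)), 2))
        if tabu is not None and (i, j) in tabu:
            continue                      # memory: move made recently, skip
        cand = tour[:i] + tour[i:j][::-1] + tour[j:]   # 2-opt reversal
        if length(cand) < length(tour):
            tour = cand
            if tabu is not None:
                tabu.append((i, j))       # remember the move
    return tour

start = list(range(len(cities)))
print("memoryless 2-opt:", round(length(two_opt(start)), 3))
print("tabu 2-opt      :", round(length(two_opt(start, deque(maxlen=50))), 3))
```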
Cross Layer based Modified Virtual Backoff Algorithm for Wireless Sensor Networks
by Ramesh Babu Palamakula, P. Venkata Krishna
Abstract: For more than a decade, there has been tremendous growth in the use of wireless sensor networks in various applications. Effective and fruitful network operation is guaranteed by an efficient MAC protocol. Hence, this paper proposes the Cross-Layer based Modified Virtual Backoff Algorithm (CLM-VBA). The cross-layer architecture designed in this paper involves three layers: the physical layer, the MAC layer and the network layer. Priority is given to neighbouring nodes, which are determined using the cross-layer architecture. Two counters are maintained to keep track of the number of accesses and the number of attempts, along with a sequence number. Delay-sensitive applications are given preference over delay-insensitive applications. A sleep mode is used at each node in order to conserve energy, and a buffer is maintained at each node to improve system performance. The proposed CLM-VBA algorithm is compared with VBA, S-MAC and M-VBA in terms of delay, packet delivery ratio, energy consumption and number of collisions, and proves to be better.
Keywords: backoff; channel access; cross layer; MAC; collision.
Special Issue on: New Challenges in Intelligent Computing and Applications
Dynamic Priority based Packet Handling protocol for Healthcare Wireless Body Area Network system
by Sapna Gambhir, Madhumita Kathuria
Abstract: The vision of the Wireless Body Area Network (WBAN) is to facilitate, improve and have an immense impact on healthcare systems by identifying the risk level or severity factor of a patient in various emergencies. Modern technical advances in WBANs have revolutionised this area, enabling autonomous monitoring of vital signals over long durations and from remote places. However, handling heterogeneous packets in a fast-changing healthcare scenario remains an opportunity for exploration. We present a novel concept of Dynamic Priority based Packet Handling (DPPH), which promises to add exciting capabilities to the world of WBANs. DPPH uses the principles of accurate identification and classification of heterogeneous packets to effectively determine a patient's critical condition and alert the medical server. In this paper, we focus on dynamic prioritisation-based queuing, scheduling, resource allocation and alerting policies for performance enhancement. The proposed approach is validated through a comparison with an existing approach: the protocol is implemented in the network simulator NS-2.35 and judged on the basis of packet delivery ratio, loss ratio, end-to-end delay and throughput, with variation in the number of nodes.
Keywords: Alert; Abnormal condition; Weighted Deviation; Detection; Prioritization; Packet Handling; Vital signal; Wireless Body Area Network.
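The dynamic prioritisation in DPPH is described only at a high level; as a loose illustration under assumed severity scoring (not the DPPH formula), the sketch below uses a heap so that packets whose vital-sign readings deviate most from a normal range are dequeued, and can trigger an alert, first.

```python
# Hedged sketch of dynamic priority packet handling: packets whose vital-sign
# reading deviates most from a normal range are served first and may raise
# an alert. The severity scoring is an assumption, not the DPPH formula.
import heapq

NORMAL = {"heart_rate": (60, 100), "temp": (36.1, 37.5)}

def severity(kind, value):
    lo, hi = NORMAL[kind]
    if lo <= value <= hi:
        return 0.0
    return min(abs(value - lo), abs(value - hi)) / (hi - lo)  # weighted deviation

queue, seq = [], 0
for kind, value in [("heart_rate", 72), ("temp", 39.2), ("heart_rate", 140)]:
    seq += 1
    # heapq is a min-heap, so push negative severity for highest-first service.
    heapq.heappush(queue, (-severity(kind, value), seq, kind, value))

while queue:
    neg_sev, _, kind, value = heapq.heappop(queue)
    alert = " -> ALERT medical server" if -neg_sev > 0.3 else ""
    print(f"{kind}={value} severity={-neg_sev:.2f}{alert}")
```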
Development of Software Effort Estimation Using a Non-Fuzzy Model
by H. Parthasarath Patra, Kumar Rajnish
Abstract: Nowadays, accurate estimation of software effort is a challenging issue for modern software developers, since winning a contract depends purely on the estimated cost of the software, and over- or under-estimation leads to a loss or gain on the project and affects its probability of success or failure. In this paper we use a non-fuzzy conditional algorithm to build a suitable model to estimate software effort, taking NASA software project data. We construct a set of linear conditional models over the domain of possible KLOC (Kilo Lines of Code) values. The performance of the developed model has been analysed using the NASA data set, and comparisons with the results of the COCOMO tuned-PSO, Halstead, Walston-Felix, Bailey-Basili and Doty models are provided.
Keywords: Lines of code; Software cost estimation; MRE; MMRE; PRED.
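The keywords cite MRE, MMRE and PRED; these standard effort-estimation metrics are easy to state: MRE = |actual - estimated| / actual, MMRE is its mean over all projects, and PRED(25) is the fraction of projects with MRE ≤ 0.25. A minimal sketch with made-up effort values:

```python
# MRE / MMRE / PRED(25) on made-up actual vs estimated efforts (person-months).
actual    = [24.0, 60.0, 12.5, 90.0, 33.0]
estimated = [20.0, 71.0, 13.0, 70.0, 30.0]

mre = [abs(a - e) / a for a, e in zip(actual, estimated)]
mmre = sum(mre) / len(mre)
pred25 = sum(m <= 0.25 for m in mre) / len(mre)

print(f"MMRE = {mmre:.3f}, PRED(25) = {pred25:.0%}")
```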
HiRSA: Computing Hit Ratio for SOA applications through Tcases
by Arpita Dutta, Haripriya Kunsothh, Sangharatna Godboley, Durga Prasad Mohapatra
Abstract: In this article, we propose a novel method for black-box test case generation for Business Process Execution Language (BPEL) processes. We also propose a method to compute the Hit Ratio percentage metric, along with the total time taken for test case generation and the speed of test case generation. We design the Service-Oriented Architecture (SOA) based applications using the OpenESB tool. Because the OpenESB-generated .xml code is incompatible with the Tcases input framework, we have developed a code converter to generate Tcases-compatible .xml code. After that, we compute the Hit Ratio percentage with the help of the Hit Ratio Calculator. We have experimented with twelve SOA-based applications; on average, we achieve a Hit Ratio of 62.78% with an average time of 873.08 ms and a speed of 32.41 (approx. 32) test cases per second.
Keywords: Service-Oriented Architecture; SOA; Hit Ratio; Black-box testing; Tcases.
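The abstract does not define its Hit Ratio metric here; assuming it is the percentage of generated test cases that hit (exercise) their intended targets, the sketch below shows how the percentage, total generation time and speed figures could be computed. The generator is a dummy stand-in, not the Tcases toolchain.

```python
# Hedged sketch of the reported metrics, assuming Hit Ratio is the
# percentage of generated test cases that hit their intended targets.
import time

def generate_test_cases(n):
    """Stand-in for Tcases-based generation; returns (test_case, hit?) pairs."""
    return [(f"tc_{i}", i % 3 != 0) for i in range(n)]  # dummy hit pattern

start = time.perf_counter()
cases = generate_test_cases(28)
elapsed_ms = (time.perf_counter() - start) * 1000.0

hit_ratio = 100.0 * sum(hit for _, hit in cases) / len(cases)
speed = len(cases) / (elapsed_ms / 1000.0) if elapsed_ms else float("inf")
print(f"Hit Ratio = {hit_ratio:.2f}%, time = {elapsed_ms:.2f} ms, "
      f"speed = {speed:.0f} test cases/s")
```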
Special Issue on: Biomedical Signal and Imaging Trends and Artificial Intelligence Developments
Application of Ensemble Artificial Neural Network for the Classification of White Blood Cells using Microscopic Blood Images
by Jyoti Rawat, Annapurna Singh, Harvendra Singh Bhadauria, Jitendra Virmani, Jagrtar Singh Devgun
Abstract: Introduction: This work exhibits an application of an ensemble artificial neural network for white blood cell classification. The incentive for experimenting with artificial neural network (ANN) based computer-aided classification (CAC) designs is that designs based on ensemble methods are expected to yield a better outcome than CAC system designs based on a single multiclass classifier. In recent times, digital image processing techniques have been widely utilised as part of health diagnosis. In order to overcome the problems of manual diagnosis in recognising the morphology of blood cells, automated analysis is frequently utilised by pathologists. In the pathology lab, white blood cells are analysed by an expert, which is a tedious and subjective task. With the end goal of enhancing precision, an automatic white blood cell classification framework is crucial for helping pathologists diagnose various haematological disorders such as leukemia or lymphoma. This work gives a semi-automated technique to identify and classify white blood cells.
Method: In this work, a k-means clustering algorithm is used to segment the nucleus by enhancing the region of the white blood cell nucleus and suppressing the other components of the blood smear images. From each cell image, different features, such as shape, chromatic and texture features, are extracted. This feature set is used to train the classifier in order to determine the different classes of white cells.
Results: From this evaluation of classification models, we established that the CAC system design based on the ensemble artificial neural network is the most suitable model for four-class white cell classification, with an accuracy of 95%.
Conclusions: The proposed method analyses blood cells automatically via image processing techniques, offering a means to avoid the many drawbacks associated with labour-intensive manual examination of white cells.
Keywords: White blood cell; Segmentation; k-means clustering; Texture features; Shape features; Chromatic features; Artificial neural network classifier.
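The nucleus segmentation step uses k-means clustering on blood smear pixels; a minimal stand-in (with a synthetic image instead of real microscopy data) could look like this, clustering pixel colours and keeping the darkest cluster as the stained nucleus.

```python
# k-means nucleus segmentation sketch: cluster pixel colours and keep the
# darkest cluster as the (stained) nucleus. The image here is synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
img = rng.uniform(0.7, 1.0, (64, 64, 3))                 # bright background
img[20:40, 20:40] = rng.uniform(0.0, 0.3, (20, 20, 3))   # dark "nucleus"

pixels = img.reshape(-1, 3)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
dark_cluster = km.cluster_centers_.sum(axis=1).argmin()  # lowest brightness
mask = (km.labels_ == dark_cluster).reshape(64, 64)

print(f"nucleus pixels found: {mask.sum()} (expected about {20 * 20})")
```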
Comparative Studies of Discrete Cosine Transform (DCT) and Lifting Wavelet Transform (LWT) Techniques for Compression of Blood Pressure Signal in Salt Sensitive Dahl Rat
by Vibha Aggarwal, Manjeet Singh Patterh, Virinder Kumar Singla
Abstract: This paper presents a study of quality-controlled Discrete Cosine Transform (DCT) and Lifting Wavelet Transform (LWT) based compression methods for blood pressure signals in the salt-sensitive Dahl rat. The transformed coefficients are thresholded using the bisection algorithm to match a predefined, user-specified percentage root-mean-square difference (PRD) within tolerance. A binary lookup table is then built to store the position map of zero and non-zero coefficients (NZC). The NZC are quantised by a Max-Lloyd quantiser followed by arithmetic coding, while the lookup table is encoded by Huffman coding. Results are presented for different blood pressure signals of varying characteristics. For both transforms there is no significant difference between the PRD before quantisation (BPRD) and after quantisation (QPRD), and the mean compression ratio increases with an increase in the user-defined PRD (UPRD).
Keywords: Blood Pressure signal in Salt Sensitive Dahl Rat; Compression; Nonlinear transform; Linear transform.
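The PRD-matching step can be made concrete: PRD = 100·sqrt(Σ(x - x̂)² / Σx²), and bisection searches for the coefficient threshold whose reconstruction meets the user-specified PRD. The sketch below does this for a DCT of a synthetic signal; it illustrates the mechanism only, not the paper's full quality-controlled codec.

```python
# Bisection search for a DCT threshold matching a user-specified PRD.
# PRD = 100 * sqrt(sum((x - x_rec)^2) / sum(x^2)). Signal is synthetic.
import numpy as np
from scipy.fft import dct, idct

t = np.linspace(0, 1, 512)
x = (100 + 20 * np.sin(2 * np.pi * 1.2 * t)
     + 2 * np.random.default_rng(0).standard_normal(512))

def prd_at(threshold, coeffs, x):
    kept = np.where(np.abs(coeffs) >= threshold, coeffs, 0.0)  # zero small coeffs
    x_rec = idct(kept, norm="ortho")
    return 100.0 * np.sqrt(np.sum((x - x_rec) ** 2) / np.sum(x ** 2))

coeffs = dct(x, norm="ortho")
target, lo, hi = 2.0, 0.0, np.abs(coeffs).max()   # user-defined PRD (UPRD)
for _ in range(50):                                # bisection on the threshold
    mid = 0.5 * (lo + hi)
    if prd_at(mid, coeffs, x) < target:
        lo = mid      # reconstruction still too good: can threshold more
    else:
        hi = mid
print(f"threshold = {lo:.3f}, PRD = {prd_at(lo, coeffs, x):.3f}%")
```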
B-mode breast ultrasound image segmentation techniques: an investigation and comparative analysis
by Madan Lal, Lakhwinder Kaur, Savita Gupta
Abstract: Breast cancer is the second leading cause of death among women. A commonly used method for the detection of breast cancer is ultrasound imaging. Ultrasonic imaging is a low-cost, easy-to-use, non-invasive and portable modality, but it suffers from acoustic interference (speckle noise) and other artifacts. As a result, it becomes difficult for experts to directly identify the exact shapes of abnormalities in these images. Numerous techniques have been proposed for visual enhancement and for segmentation of lesion regions in breast ultrasound images. In this work, different automatic and semi-automatic breast ultrasound image segmentation techniques are reviewed, with a brief explanation of their technological aspects. The performance of selected methods has been evaluated on a database of 45 B-mode breast ultrasound images containing benign and malignant tumors (25 benign and 20 malignant). For performance analysis of the segmentation methods, images manually delineated by an expert radiologist are used as ground truth, while boundary and area error metrics are used for comparison of quantitative results.
Keywords: B-Mode Breast Ultrasound (BUS) Image; Speckle Noise; Thresholding; Region Growing; Fuzzy Clustering; Watershed; Active Contour; Level Set.
An improved unsupervised mapping technique using AMSOM for neurodegenerative disease detection
by Isha Suwalka, Navneet Agrawal
Abstract: The most challenging aspect of medical imaging is the accurate detection of neurodegenerative diseases. Even with the advent of new imaging techniques, manual evaluation, manual reorientation and other time-consuming steps at reduced resolution remain limitations. There is therefore a need to develop an efficient algorithm for proper detection that provides quantitative information of significance to clinicians. The proposed algorithm is an improved adaptive moving self-organising map (AMSOM) that trains on the extracted features together with the Mini-Mental State Examination (MMSE) factor and a volumetric parameter obtained using a volume-based method (VBM) for computing the feature data set, which together improve iteration time, mean square error, sensitivity and accuracy. The algorithm is an improved version of the moving-mapping method: it tackles the SOM drawback of a fixed grid mapping and improves the neighbourhood function of the neurons, providing better detection and classification and yielding promising results. It further improves AMSOM performance through better visualisation of the input dataset and provides a framework for determining the optimal number and structure of neurons. This paper uses a real MRI dataset taken from OASIS, a cross-sectional collection of 416 subjects aged 18 to 96. The analysis compares different mapping approaches and reveals features associated with Alzheimer's disease.
Keywords: self organizing mapping for MRI image; hierarchical mapping with GHSOM; e-database using OASIS; moving neuron concept using AMSOM; clustering for detection of Alzheimer Disease.
Active Contours using Global Models for Medical Image Segmentation
by Ramgopal Kashyap, Vivek Tiwari
Abstract: Accurate segmentation together with denoising is a subject of research in the fields of medical imaging and computer vision. This paper presents an enhanced energy-based active contour model with a level set formulation. The local energy fitting term exerts a local force that pulls the contour and confines it to object boundaries, while the global intensity fitting term drives the motion of the contour far from the object boundaries. The global energy term is based on a global segmentation computation that can capture the intensity information of an image better than the Chan-Vese (CV) model. The local and global terms are jointly combined to build an energy functional, formulated on a level set, to segment images with intensity inhomogeneity. Experiments demonstrate that the proposed model has the advantage of noise resistance and is superior to conventional image segmentation models, performing better both qualitatively and quantitatively compared with other state-of-the-art methods.
Keywords: Denoising; Energy based active contour; Image segmentation; Intensity inhomogeneity; Local binary fitting; Local region based active contour.
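The local/global fitting terms extend the Chan-Vese model; a stripped-down sketch of the global two-phase CV idea (curvature regularisation and the local fitting term omitted for brevity, synthetic image) illustrates the mechanism being improved upon.

```python
# Bare-bones two-phase Chan-Vese iteration (global fitting term only;
# curvature regularisation and the local fitting term are omitted).
import numpy as np

rng = np.random.default_rng(1)
img = rng.normal(0.2, 0.05, (64, 64))      # dark background with noise
img[16:48, 16:48] += 0.6                   # bright object

phi = img - img.mean()                     # simple data-driven initial level set
for _ in range(100):
    inside, outside = phi > 0, phi <= 0
    c1 = img[inside].mean() if inside.any() else 0.0    # mean inside contour
    c2 = img[outside].mean() if outside.any() else 0.0  # mean outside contour
    # Move phi so pixels closer to c1 end up inside, closer to c2 outside.
    phi += 0.5 * (-(img - c1) ** 2 + (img - c2) ** 2)

seg = phi > 0
print(f"object pixels: {seg.sum()} (true object is {32 * 32})")
```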
A Computerized Framework for Prediction of Fatty and Dense Breast Tissue Using Principal Component Analysis and Multi-resolution Texture Descriptors
by Indrajeet Kumar, Harvendra Singh Bhadauria, Jitendra Virmani
Abstract: The present work proposes a computerized framework for prediction of fatty and dense breast tissue using principal component analysis and multi-resolution texture descriptors. For this study, 480 MLO-view digitized screen-film mammograms were taken from the DDSM dataset. A fixed ROI size of 128
Keywords: Mammography; Breast density classification; Multi-resolution texture descriptors; principal component analysis; Support vector machine (SVM) classifier.
GPU-based Focus-Driven Multi-coordinates Viewing System for Large Volume Data Visualization
by Piyush Kumar, Anupam Agrawal
Abstract: Biomedical scanning modalities such as Computed Tomography (CT), Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI) are improving in resolution day by day. Novice physicians may face problems when relying on exploring 2D slices while diagnosing against the full 3D human anatomy at the same time. In this paper, we present a generalised, contactless, interactive Graphics Processing Unit (GPU) accelerated, Compute Unified Device Architecture (CUDA) based focus-and-context visualisation approach that displays the inner anatomy of the large-scale Visible Human male dataset in a Multi-Coordinate Viewing System (MCVS). Focusing is achieved through a 3D Cartesian region of interest (ROI). The large dataset is structured using the octree method. Volume rendering is performed using an improved ray-cube intersection method for voxels together with the ray casting algorithm. The final results allow doctors to diagnose and analyse an atlas of 8-bit CT-scan data in three-dimensional visualisation at efficient rendering frame rates across operations such as zooming, rotating and dragging. The system has been tested with multiple types of 3D medical datasets ranging from 10 MB to 3.15 GB, allowing medical practitioners and physicians to peer inside a dataset and use its inner information. The system was further tested with three NVIDIA CUDA-enabled GPU cards for performance analysis. The scope of this system is exploration of the human body for surgical purposes.
Keywords: Volume Visualization; Focus-driven; MCVS; Focus and Context; MRI dataset; Medical dataset.
Volumetric Tumor Detection Using Improved Region Grow Algorithm
by Shitala Prasad, Shikha Gupta
Abstract: This paper addresses the segmentation of brain pathological tissues (tumor, edema and necrotic core) and visualises them in 3D for better physiological understanding. We propose a novel approach that combines thresholding and region growing for tumor detection. In the proposed system, the FLAIR and T2 modalities of MRI are used due to their unique ability to detect high- and low-contrast lesions with great accuracy. First, the tumor is segmented from an image combining FLAIR and T2 using a threshold value selected automatically based on the intensity variance of tumor and normal tissues in the 3D MR images. The tumor part is then extracted from the actual 3D brain MRI by selecting the largest connected volume; to correctly detect the tumor, 26-connected neighbours are used. The method is evaluated using the publicly available BRATS dataset of 80 different patients with glioma tumors. The detection accuracy reaches 97.5%, which is the best compared with other state-of-the-art methods within the given time frame. The algorithm takes 4-5 minutes to generate the 3D visualisation for the final output.
Keywords: 3D Volumetric; Brain Tumor; Region Growing Algorithm; Thresholding; Voxel Seeding.
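The detection pipeline (threshold, then keep the largest 26-connected volume) can be sketched with scipy.ndimage; the volume below is synthetic, and the threshold choice is simplified to a fixed percentile rather than the paper's intensity-variance-based selection.

```python
# Threshold + largest 26-connected component, loosely following the pipeline
# above. Volume is synthetic; the threshold here is a fixed percentile rather
# than the paper's intensity-variance-based choice.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
vol = rng.normal(100, 10, (40, 40, 40))        # "brain" background
vol[10:20, 10:20, 10:20] += 80                 # bright "tumour" blob
vol[30:32, 30:32, 30:32] += 80                 # small bright distractor

mask = vol > np.percentile(vol, 98)            # simplified threshold step
structure = np.ones((3, 3, 3), dtype=int)      # 26-connected neighbourhood
labels, n = ndimage.label(mask, structure=structure)
sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
tumour = labels == (int(np.argmax(sizes)) + 1) # keep largest connected volume

print(f"{n} components; largest has {tumour.sum()} voxels (true blob = 1000)")
```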
Multimodality Medical Image Fusion using Nonsubsampled Rotated Wavelet Transform for Cancer Treatment
by Satishkumar Chavan, Abhijeet Pawar, Sanjay Talbar
Abstract: This paper presents a nonsubsampled rotated wavelet transform (NSRWT) based feature extraction approach to multimodality medical image fusion (MMIF). The nonsubsampled rotated wavelet filters are designed to extract textural and edge features. These filters are applied to axial brain images of two modalities, Computed Tomography (CT) and Magnetic Resonance Imaging (MRI), to extract spectral features. The extracted features are selected using an entropy-based fusion rule to form a new composite spectral feature plane; this rule preserves dominant spectral features and imparts all relevant information from both modalities to the fused image. The inverse nonsubsampled rotated wavelet transform is applied to reconstruct the fused image from the composite spectral slice. The proposed algorithm is evaluated subjectively and objectively for efficient fusion using 39 pilot image slices from 23 patients. Three expert radiologists verified the subjective quality of the fused images to ascertain anatomical structures from the source images; their scores reveal that fused images from the proposed algorithm are superior in terms of visualisation of abnormalities to those from other wavelet-based techniques. The objective evaluation involves fusion metrics such as the image quality index (IQI), edge quality measure (EQa,b) and mean structural similarity index measure (mSSIM). The proposed algorithm yields better performance metrics than state-of-the-art wavelet-based algorithms.
Keywords: Multimodality Medical Image Fusion; Discrete Wavelet Transform; Rotated Wavelet Filters; Nonsubsampled Rotated Wavelet Transform; Cancer Treatment; Radiotherapy.
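The fusion rule operates on transform coefficients; since rotated wavelet filters are specialised, the sketch below substitutes a standard pywt decomposition and a simple maximum-absolute-coefficient selection rule in place of the paper's nonsubsampled rotated wavelets and entropy-based rule, just to show the fuse-then-invert structure.

```python
# Fuse-then-invert structure of wavelet-domain image fusion. Standard
# wavelets and a max-abs selection rule stand in for the paper's
# nonsubsampled rotated wavelets and entropy-based rule.
import numpy as np
import pywt

rng = np.random.default_rng(3)
ct = rng.uniform(size=(64, 64))    # stand-ins for registered CT / MRI slices
mri = rng.uniform(size=(64, 64))

c_ct = pywt.wavedec2(ct, "db2", level=2)
c_mri = pywt.wavedec2(mri, "db2", level=2)

def pick(a, b):
    """Keep, per coefficient, whichever source has the larger magnitude."""
    return np.where(np.abs(a) >= np.abs(b), a, b)

fused = [0.5 * (c_ct[0] + c_mri[0])]           # average the approximation band
for (h1, v1, d1), (h2, v2, d2) in zip(c_ct[1:], c_mri[1:]):
    fused.append((pick(h1, h2), pick(v1, v2), pick(d1, d2)))

fused_img = pywt.waverec2(fused, "db2")
print("fused image shape:", fused_img.shape)
```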
Comparison of feature extraction techniques for classification of hardwood species
by Arvind R. Yadav, R.S. Anand, M.L. Dewal, Sangeeta Gupta, Jayendra Kumar
Abstract: The texture of an image plays an important role in the identification and classification of images. An image of a hardwood species contains four key elements, namely vessels (popularly known as pores in cross-sectional view), fibres, parenchyma and rays, useful in its identification and classification. Further, the arrangement of these elements possesses texture-rich features. Thus, this work investigates existing texture feature extraction techniques for the classification of hardwood species. Texture features are extracted from grayscale images of the hardwood species to reduce computational complexity. Linear support vector machine (SVM), radial basis function (RBF) kernel SVM, random forest (RF) and linear discriminant analysis (LDA) classifiers are employed to investigate the efficacy of the texture feature extraction techniques, and the classification accuracies of the existing texture descriptors are compared. Further, principal component analysis (PCA) and the minimal-redundancy-maximal-relevance (mRMR) feature selection method are employed to select the best subset of the feature vector data. The PCA-reduced feature vector data of the co-occurrence of adjacent local binary patterns (CoALBP24) texture feature extraction technique attained a maximum classification accuracy of 96.33%.
Keywords: Texture features; support vector machine; feature selection; hardwood species.
Myoelectric Control of Upper Limb Prostheses using Linear Discriminant Analysis and Multilayer Perceptron Neural Network with Back Propagation Algorithm
by Sachin Negi, Yatindra Kumar, V.M. Mishra
Abstract: Electromyogram (EMG) signals, or myoelectric signals (MES), have two prominent areas of application in the field of biomedical instrumentation. EMG signals are primarily used to analyse neuromuscular diseases such as myopathy and neuropathy. In addition, the EMG signal can be utilised in myoelectric control systems, where external devices such as upper limb prostheses, intelligent wheelchairs and assistive robots are controlled using acquired surface EMG signals. The aim of the present work is to obtain classification accuracy first by using a linear discriminant analysis (LDA) classifier, with principal component analysis (PCA) and uncorrelated linear discriminant analysis (ULDA) feature reduction techniques, for the upper limb prosthesis control application. Next, a multilayer perceptron (MLP) neural network with the back propagation algorithm is used to calculate the classification accuracy for upper limb prosthesis control.
Keywords: EMG; MCS; LDA; PCA; ULDA; MLP; Back propagation.
Comparative Study of LVQ and BPN ECG Classifier
by Ashish Nainwal, Yatindra Kumar, Bhola Jha
Abstract: The ECG is the electrical waveform of heart activity and contains much information on heart disease. It is very important to diagnose heart disease as early as possible, as otherwise it can be harmful to the patient. This paper classifies ECG signals using learning vector quantization (LVQ) and back propagation neural networks (BPN) with morphological and frequency-domain ECG features. The 45 ECG signals from the MIT-BIH arrhythmia database are classified into two classes, normal and abnormal, using the above-mentioned classifiers; of the 45 signals, 25 are normal and 20 are abnormal according to MIT-BIH. 28 morphological features and 4 frequency-domain features are set as inputs to the classifiers. Classifier performance is measured in terms of sensitivity (Se), positive predictivity (PP) and specificity (SP). The system achieves 82.35% accuracy using LVQ and 94.11% using BPN.
Keywords: Back Propagation Neural Network; Learning Vector Quantization; ECG; MIT-BIH.
Automatic feature extraction of ECG signal based on adaptive window dependent Differential histogram approach and validation with CSE database
by Sucharita Mitra, Madhuchhanda Mitra, Basudev Halder
Abstract: A very simple and novel idea based on an adaptive-window-dependent differential histogram approach is proposed for the automatic detection and identification of ECG waves and their characteristic features. To facilitate estimation of the waves, the normalised signal is divided into a few small windows by an adaptive window selection technique. By counting the number of changes between successive samples as the frequency, the differential histogram is plotted. Zones having an area greater than a predefined threshold are marked as QRS zones, and the local maxima of these zones are taken as the R-peaks; T and P peaks are also detected. Baseline points and clinically significant time-plane features are computed and validated against reference values from the CSE database. The proposed technique achieves better performance than the CSE groups, with a sensitivity of 99.86%, positive predictivity of 99.76% and detection accuracy of 99.8%.
Keywords: Adaptive window; Differential histogram; CSE database; baseline; Sensitivity; ECG signal; QRS zones; R-peaks; Distinctive points; Sample values.
A Comparative Study on Kapur's and Tsallis Entropy for Multilevel Thresholding of MR Images via Particle Swarm Optimization Technique
by Taranjit Kaur, Barjinder Singh Saini, Savita Gupta
Abstract: The present paper explores both Kapur's and Tsallis entropy for three-level thresholding of brain MR images. The optimal thresholds are obtained by maximising these entropies using a population-based search technique, particle swarm optimisation (PSO). The algorithm is implemented for the segregation of the various tissue constituents, i.e., cerebrospinal fluid (CSF), white matter (WM) and gray matter (GM), from simulated images obtained from the BrainWeb database. The efficacy of the thresholding methods is evaluated by a measure of spatial overlap, the Dice coefficient. The experimental results show that: 1) for both WM and CSF, Tsallis entropy outperforms Kapur's entropy, achieving average Dice values of 0.967279 and 0.878031 respectively; and 2) for GM, Kapur's entropy is more beneficial, as justified by a mean Dice value of 0.851025 for this case.
Keywords: Kapur’s; Tsallis; multilevel thresholding; PSO.
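Kapur's entropy for a threshold t sums the entropies of the sub-histograms on either side of t. The sketch below evaluates that criterion by exhaustive search for a single threshold on a synthetic bimodal histogram; the paper instead maximises the three-level version of this objective with PSO.

```python
# Kapur's entropy criterion for a single threshold, maximised by exhaustive
# search on a synthetic bimodal histogram. The paper maximises the
# three-level version of this objective with PSO instead.
import numpy as np

rng = np.random.default_rng(4)
pixels = np.concatenate([rng.normal(70, 10, 5000), rng.normal(170, 12, 5000)])
hist, _ = np.histogram(pixels, bins=256, range=(0, 256))
p = hist / hist.sum()

def kapur(t):
    """H(background) + H(foreground) for threshold t on distribution p."""
    w0, w1 = p[:t].sum(), p[t:].sum()
    if w0 == 0 or w1 == 0:
        return -np.inf
    q0, q1 = p[:t] / w0, p[t:] / w1
    h0 = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0]))
    h1 = -np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))
    return h0 + h1

best_t = max(range(1, 256), key=kapur)
print(f"Kapur-optimal threshold = {best_t}")   # should fall between the modes
```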
Special Issue on: Data Analysis for Enabling Technological and Computational Enhancement in Design and Optimisation in Various Engineering Domains
Design of PID Controller for Magnetic Levitation System using Modified Gravitational Search Algorithm
by Ankush Rathore, Harish Sharma, Manisha Bhandari
Abstract: The Gravitational Search Algorithm (GSA) is a swarm-intelligence-based algorithm inspired by the law of motion and the law of gravity. GSA can suffer a loss of exploitation capability. To find a trade-off between the exploration and exploitation capabilities of GSA, a modified gravitational search algorithm is proposed, namely the Exponent Inertia Weight based GSA (EIWGSA). The proposed algorithm maintains a proper balance between the exploitation and exploration skills of GSA by introducing an exponent inertia weight (EIW) parameter. The proposed algorithm is implemented over 15 benchmark functions and compared with the basic GSA, BBO and PSO algorithms. The EIWGSA algorithm is then applied to the design of a PID controller for the magnetic levitation system over widely differing operating air gaps of 3 mm, 10 mm and 17 mm.
Keywords: Gravitational Search Algorithm; Swarm Intelligence; Inertia Weight; Magnetic Levitation System.
CloudCampus: building a ubiquitous Cloud with classroom PCs at a university campus
by Andre Monteiro, Claudio Teixeira, Joaquim Sousa Pinto
Abstract: While Cloud Computing is still a developing paradigm, many of its existing challenges point to new research trends, such as resource and power saving. Current datacentres are being used more efficiently, new hardware tries to comply with energy-saving goals, and software helps to fulfil them. On the other hand, resource optimisation can also be undertaken by maximising the use of existing resources, even when they were not intended for cloud purposes or lack state-of-the-art hardware. This paper investigates how to integrate common desktop PCs, present in large numbers across a university campus, into a Cloud infrastructure at low cost, and how to deliver appropriate services to researchers. We propose a model to categorise applications, show how to build the infrastructure, and present performance and consumption results.
Keywords: Distributed applications; resource management; scheduling; performance evaluation.
Fast and Effective Image Retrieval using Color and Texture Features with Self Organizing Map
by Vibhav Prakash Singh, Ashim Gupta, Rajeev Srivastava
Abstract: Content-based image retrieval is an emerging area in computer vision in which similar images are retrieved from a huge database on the basis of their visual content. Most image retrieval systems are still incapable of providing good retrieval results in a short search time. In this paper, we introduce a self-organising map (SOM) clustering approach with a fusion of features; using the SOM, system performance is improved by the learning and searching capability of the neural network. First, we extract colour moments, colour histogram, local binary pattern, colour percentile and wavelet-transform-based colour and texture features. All these features are computationally lightweight, speeding up the process of image indexing. These feature sets are then fused together with equal weights, and the hybrid features are fed to the SOM, which generates clusters of images having similar visual content, each with its own centre. The content of a query image is matched against all cluster representatives to find the closest cluster, and images are finally retrieved from this closest cluster using a similarity measure. At search time, the query image is therefore compared only with a small subset, depending on cluster size, and not with all images in the database, which yields a superior response time with good retrieval performance. Experiments on a benchmark database show that the proposed clustering with hybrid features performs encouragingly.
Keywords: Feature Extraction; Self Organizing Map; Content Based Image Retrieval; Searching; Similarity Measure.
TripletDS: A prototype of dataspace system based on triple data model
by Mrityunjay Singh, S.K. Jain
Abstract: A dataspace system provides a powerful mechanism for searching and querying structured, semi-structured and unstructured data in an integrated manner. This paper aims to build a prototype called the Triplet Dataspace System (TripletDS) to provide an on-demand, large-scale data integration solution with little effort. TripletDS is a prototype dataspace system based on the triple model, a simple and flexible data model that supports the Subject-Predicate-Object (SPO) query language. The proposed prototype has the ability to efficiently bridge the gaps caused by syntactic and structural heterogeneity among data. The performance of TripletDS has been verified on data sets including personal data and relational data.
Keywords: TripletDS; Dataspace; TripletDSpace; Triple Model; DSP Tool; Transformation Rules.
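The SPO query model is easy to picture. Here is a minimal in-memory triple store with pattern matching (None as a wildcard), purely to illustrate the Subject-Predicate-Object style of querying the abstract refers to; it is not the TripletDS implementation, and all data values are invented.

```python
# Minimal in-memory Subject-Predicate-Object store with wildcard queries,
# illustrating the triple model's query style (not the TripletDS code).
triples = [
    ("alice", "works_at", "acme"),
    ("alice", "email", "alice@example.org"),
    ("paper42", "authored_by", "alice"),
]

def query(s=None, p=None, o=None):
    """Return triples matching the pattern; None matches anything."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

print(query(s="alice"))                    # everything known about alice
print(query(p="authored_by", o="alice"))   # papers alice authored
```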
A Fireworks Algorithm for Solving Traveling Salesman Problem
by Zoubair Taidi, Lamia Benameur, Jihane Alami Chentoufi
Abstract: In this paper, a novel swarm intelligence algorithm inspired by observing firework explosions, called the Fireworks Algorithm (FW), is proposed for solving the traveling salesman problem (TSP). The TSP is a well-known NP-hard combinatorial optimization problem that is easy to state but hard to solve. Many real-world problems can be formulated as instances of the TSP, for example computer wiring, vehicle routing, crystallography, robot control, drilling of printed circuit boards and chronological sequencing. The proposed algorithm has been evaluated on TSP instances taken from the TSPLIB library and compared with other methods in the literature. Computational results show that the proposed fireworks algorithm is competitive in terms of solution quality compared with other techniques.
Keywords: Meta-Heuristic; Fireworks Algorithm; Optimization; Swarm Intelligence; Traveling salesman problem.
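A discrete fireworks step for the TSP can be pictured as follows: each current tour (firework) explodes into sparks, here random 2-opt reversals, with better tours producing more sparks, and the best tours survive to the next generation. The sketch below is a generic toy adaptation under those assumptions, not the authors' algorithm.

```python
# Toy discrete fireworks step for the TSP: each tour (firework) explodes into
# sparks made by 2-opt reversals; better tours get more sparks; best survive.
# This is a generic illustration, not the authors' algorithm.
import random

random.seed(5)
cities = [(random.random(), random.random()) for _ in range(25)]

def tour_len(tour):
    return sum(((cities[tour[i]][0] - cities[tour[i - 1]][0]) ** 2 +
                (cities[tour[i]][1] - cities[tour[i - 1]][1]) ** 2) ** 0.5
               for i in range(len(tour)))

def spark(tour):
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j][::-1] + tour[j:]       # one 2-opt reversal

fireworks = [random.sample(range(25), 25) for _ in range(5)]
for gen in range(60):
    lengths = [tour_len(t) for t in fireworks]
    worst = max(lengths)
    pool = list(fireworks)
    for t, L in zip(fireworks, lengths):
        n_sparks = 1 + int(4 * (worst - L) / (worst + 1e-9))  # fitter => more
        pool += [spark(t) for _ in range(n_sparks)]
    fireworks = sorted(pool, key=tour_len)[:5]          # selection

print(f"best tour length after 60 generations: {tour_len(fireworks[0]):.3f}")
```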