Forthcoming articles

 


International Journal of Computational Systems Engineering

 

These articles have been peer-reviewed and accepted for publication in IJCSysE, but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

 

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

 


 

Register for our alerting service, which notifies you by email when new issues of IJCSysE are published online.

 

We also offer RSS feeds which provide timely updates of tables of contents, newly published articles and calls for papers.

 

International Journal of Computational Systems Engineering (39 papers in press)

 

Regular Issues

 

  • Finding the Best Bug Fixing Rate and Bug Fixing Time Using Software Reliability Modelling   Order a copy of this article
    by Rama Rao 
    Abstract: This article focuses on finding the best possible bug fixing rate (BFR) and bug fixing time (BFT). Several software projects have been verified when materializing the bug fixing rate. To increase the bug fixing rate, bug traceability time is reduced by attaching a version tag to each and every component of a software deliverable. Software build release time is optimized using mathematical optimization techniques such as software reliability growth and non-homogeneous Poisson process (NHPP) models; this is essential in the present market scenario. Build inconsistency and automation issues are also rectified in this research work. The developed approach reduces defects and improves software quality by increasing the bug fixing rate.
    Keywords: Bug Fixing Rate; Bug Fixing Time; Bug Traceability Time; Software Build Automation; Software Reliability; Version Tag; Software Risk and Version Control System.
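
    The abstract above optimizes release time with software reliability growth and NHPP models. As a minimal, hypothetical sketch of that idea (the classic Goel-Okumoto NHPP, not necessarily the authors' model, with made-up parameters), the mean value function m(t) = a(1 - e^(-bt)) predicts bugs fixed by time t, and the build can be released once the expected residual defects fall below a target:

        import numpy as np

        def goel_okumoto_mean(t, a, b):
            """Expected cumulative bugs fixed by time t under the
            Goel-Okumoto NHPP model: m(t) = a * (1 - exp(-b*t))."""
            return a * (1.0 - np.exp(-b * t))

        def release_time(a, b, residual_target):
            """Earliest t where expected residual defects a - m(t)
            drop below residual_target: t = ln(a / target) / b."""
            return np.log(a / residual_target) / b

        # Hypothetical parameters: a = 120 expected total bugs,
        # b = 0.05/day fix rate; ship with <= 5 expected residual bugs.
        a, b = 120.0, 0.05
        t_rel = release_time(a, b, 5.0)
        print(f"release after {t_rel:.1f} days, "
              f"bugs fixed by then: {goel_okumoto_mean(t_rel, a, b):.1f}")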

  • Evolutionary Optimisation to Minimise Material Waste in Construction   Order a copy of this article
    by Andy Connor, Wilson Siringoringo 
    Abstract: This paper describes the development and evaluation of a range of metaheuristic search algorithms applied to the optimal design of two-dimensional layout problems, with particular application to residential building construction. Results are presented to allow the performance of the different algorithms to be compared in the Pareto-optimal solution space, with the resulting solutions identified and analysed in the objective space. These results show that all of the algorithms investigated have the potential to be applied to optimise material layout and improve the design processes used during building construction.
    Keywords: Metaheuristic algorithms; Evolutionary computation; Layout optimisation; Residential construction.

  • Study and Analytical Perspective on Big Data   Order a copy of this article
    by Yashika Goyal, Yuvraj Monga, Mohit Mittal 
    Abstract: Over the past era, great advances have been seen in the technological world, which is expanding exponentially in the field of virtualization. As a result, almost every field has come under digitalization, and both wired and wireless communication now operate entirely in digital form. Consequently, a huge expansion in the amount of data has been seen. To manage these large chunks of information, scientists and researchers have focused on Big Data, which has the capability to bring a paradigm shift to prevalent IT services. In this paper, we focus on the terminology of big data, its applications and the various tools used to manage big data.
    Keywords: Data; Big Data Analytics; Data Mining; Big Data applications; Big Data tool; Future trends.

  • A Survey of Machine Learning Techniques   Order a copy of this article
    by Devi I, Karpagam G R, Vinothkumar B 
    Abstract: Artificial Intelligence (AI) allows systems to observe their environments, perform certain functionalities and maximize the probability of success in solving real-world problems. With technological enhancements and scientific growth, AI has become a field of great interest, leading to an amplified focus on Machine Learning (ML) techniques. ML comprises some of the most important data analysis methods, which iteratively learn from the available data using learning algorithms. The present survey provides the theoretical representation and basic methodologies of machine learning techniques such as the Support Vector Machine (SVM), K-Nearest Neighbors (KNN), decision trees, Bayesian networks, clustering, the Hidden Markov Model (HMM) and neural networks. The survey also covers the influence of machine learning techniques such as clustering, SVM and ANN on image compression, and draws attention to the existing scope for image compression with machine learning.
    Keywords: Machine Learning; supervised learning; unsupervised learning; support vector machine; artificial neural networks and Image Compression.

  • Financial Crisis and Technical Progress 2008-2016. The Parallel Shift in Information Technology   Order a copy of this article
    by Marius Balas 
    Abstract: The paper discusses the paradoxical relationship between the 2008 global financial crisis, which triggered a widespread economic recession, and the rate of technical progress, which blossomed in the following years. We explain this phenomenon by the significant decrease in financial leverage caused by the crisis. The automotive industry, with the launch of the fully electric car, illustrates this thesis. In IT, the crisis triggered a shift towards parallel computing. Outstanding achievements in avionics, biomedical/health care equipment and dataflow computing reveal a strategic superiority of parallel systems (FPGA/ASIC) over conventional CPU/bus-oriented computing devices (von Neumann architectures: microcontrollers, DSPs, etc.) in most respects: speed, energy consumption, size, weight and reliability. Still, the parallel shift implies new paradigms, costs and efforts, and the mainstream of the IT establishment has so far been reluctant to embrace it. Recent events, however, show a new attitude and interest towards parallel architectures.
    Keywords: financial crisis; financial leverage; fully electric car; von Neumann architecture; gate array; parallel computing.

  • A Comparative Evaluation of Microwave Antenna Designs performance on Digital MR Image in Hyperthermia System   Order a copy of this article
    by Minu Sethi 
    Abstract: Various hyperthermia techniques have addressed a wide variety of clinical and technical issues. Still, there is scope for improvement, particularly in the region surrounding a tumor where healthy tissues are affected by heat. An efficient hyperthermia system is presented to evaluate and compare the performance of a rectangular E-shape microwave antenna with two different circular-shape antennas on a digital MR (magnetic resonance) image of a brain tumor, addressing these limitations. The antennas are designed in IE3D software for a centre frequency of 2.45 GHz, the microwave frequency most commonly used in hyperthermia systems for standard medical applications. A program developed in MATLAB takes the raw return-loss data of the different antennas designed in IE3D and produces an analog input signal. Thermal imaging is performed by applying the EM (electromagnetic) signal to the ROI (region of interest) in the MR image of the brain tumor. Some features of the program are: (1) rectangular binary masking is applied to the MR image of the brain tumor to create the ROI; (2) a thin plate with triangular elements is meshed onto the original tumor image; (3) a PDE (partial differential equation) is solved to define the temperature in the thin plate. The system combines data processing and analysis, and is simple and easy to use for obtaining both quantitative values and images of temperature changes. For real-time analysis, the antenna designs are fabricated and tested. Heating up to 45 degrees Celsius with minimal damage to the surrounding healthy tissues is the prime focus of the research. The evaluated results illustrate the potential of the designed antennas as hyperthermia treatment applicators.
    Keywords: microwave signal; EM (electromagnetic); MR (Magnetic Resonance) Image; hyperthermia; tumor; micro strip patch; PDE (Partial differential equation); ROI (region of interest); return loss.

  • An Empirical Evaluation of Memory Less and Memory Using Meta-Heuristics for solving Travelling Salesman Problem   Order a copy of this article
    by Arun Prakash Agrawal, Arvinder Kaur 
    Abstract: In many situations, researchers are bewildered when it comes to selecting an appropriate metaheuristic algorithm for a specific problem. To overcome such confusion, metaheuristic algorithms need to be categorized according to their ability to solve problems of varying degrees of complexity. With this in view, the performance of six popular metaheuristic algorithms has been evaluated. Three research questions are framed to test the hypothesis of a difference in performance between memory-less and memory-based metaheuristics. The domain of inquiry in this paper is the travelling salesman problem. Extensive experiments are conducted and the results are analyzed using statistical tests such as the F-test and post-hoc tests. An obvious outcome of this study is that there is an interaction effect between problem size and the metaheuristic used, and no clear superiority of one metaheuristic over another.
    Keywords: Optimization problems; meta-heuristics; Travelling Salesman Problem.
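
    A memory-less metaheuristic of the kind compared above can be illustrated with a minimal 2-opt local search for the travelling salesman problem; this is a generic sketch on random cities, not one of the six algorithms evaluated in the paper:

        import math
        import random

        def tour_length(tour, pts):
            """Total length of a closed tour over city coordinates."""
            return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                       for i in range(len(tour)))

        def two_opt(pts, iters=20000, seed=0):
            """Memory-less 2-opt: reverse a random segment and keep the
            change only if the tour gets shorter."""
            rng = random.Random(seed)
            tour = list(range(len(pts)))
            best = tour_length(tour, pts)
            for _ in range(iters):
                i, j = sorted(rng.sample(range(len(pts)), 2))
                cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                cand_len = tour_length(cand, pts)
                if cand_len < best:
                    tour, best = cand, cand_len
            return tour, best

        random.seed(42)
        pts = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(50)]
        print("tour length:", round(two_opt(pts)[1], 1))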

  • Cross Layer based Modified Virtual Backoff Algorithm for Wireless Sensor Networks   Order a copy of this article
    by Ramesh Babu Palamakula, P. Venkata Krishna 
    Abstract: For more than a decade, there has been tremendous growth in the use of wireless sensor networks in various applications. Effective and fruitful network operation is guaranteed by an efficient MAC protocol. Hence, this paper proposes the Cross Layer based Modified Virtual Backoff Algorithm (CLM-VBA). A cross layer architecture involving three layers (the physical, MAC and network layers) is designed, and priority is given to the neighbouring nodes determined through this architecture. Two counters are maintained to keep track of the number of accesses and the number of attempts, along with a sequence number. Delay-sensitive applications are given preference over delay-insensitive applications. A sleep mode is used at each node to conserve energy, and a buffer is maintained at each node to improve system performance. The proposed CLM-VBA algorithm is compared with VBA, S-MAC and M-VBA in terms of delay, packet delivery ratio, energy consumption and number of collisions, and is shown to perform better.
    Keywords: backoff; channel access; cross layer; MAC; collision.

  • Novel Approximation based Dynamical Modelling and Nonlinear Control of Electromagnetic Levitation System   Order a copy of this article
    by Ravi Gandhi, Dipak Adhyaru 
    Abstract: This paper presents a novel dynamical model for the Electromagnetic Levitation System (EMLS) as a function of the electromagnetic coil inductance in the electrical and mechanical subsystems using a Novel Approximation (NA). The proposed inductance model satisfies the requirement of the electromagnetic force with higher-order polynomials (i.e., 2nd and 3rd order) in the denominator to best fit the experimental data. To stabilize the nonlinear EMLS over a larger operating range, an Input-Output Feedback Linearization (IOFL) based nonlinear controller is designed and implemented. A theorem guaranteeing the stability of the EMLS in a large region, based on a quadratic Lyapunov function, is proposed. The investigations reveal that the proposed controller provides smooth and fast stabilizing and tracking control responses for different kinds of trajectories, and maintains robustness under bounded vertical disturbances (i.e., sinusoidal and random types).
    Keywords: magnetic levitation; novel approximation; feedback linearizing nonlinear control; Lyapunov function; stability.

  • Soft computing based on a selection index method with risk preferences under uncertainty: applications to construction industry   Order a copy of this article
    by H. Gitinavard, N. Foroozesh, S. Meysam Mousavi 
    Abstract: Decision making in the construction industry is a dynamic procedure concerned with choosing a reasonable strategy to accomplish stated objectives: the process of choosing a feasible alternative from a set of options. When an option involves just a single criterion, its nature is predictable, which makes for less complicated decision making; uncertainty arises when there is insufficient information to address the issue. In this paper, a newly developed hesitant fuzzy preference selection index (HFPSI) method, based on a new soft computing approach with the risk preferences of decision makers (DMs), is proposed to handle multi-criteria decision making (MCDM) problems in the construction industry, applying hesitant fuzzy sets (HFSs) to represent uncertain information under hesitant uncertainty. In addition to considering subjective assessment criteria, the proposed technique respects the DMs' difficulties in determining the membership of an element in a set. Furthermore, its best-option decision depends on finding an option that simultaneously considers the concepts of preference relations and hesitant fuzziness. The proposed HFPSI approach is applied to two real case studies in the construction industry and the results are examined. The studied cases, for the best construction project and the best contractor, demonstrate the effectiveness of the proposed method when considering the ideas and hesitancy of a group of DMs. The application cases also show that the method can help DMs reach reliable decisions under uncertain environments through the use of HFSs.
    Keywords: Group decision making; hesitant fuzzy sets (HFSs); construction project selection problem; contractor selection problem.

Special Issue on: New Challenges in Intelligent Computing and Applications

  • Dynamic Priority based Packet Handling protocol for Healthcare Wireless Body Area Network system   Order a copy of this article
    by Sapna Gambhir, Madhumita Kathuria 
    Abstract: The vision of the Wireless Body Area Network (WBAN) is to facilitate, improve and have an immense impact on the healthcare system in terms of identifying the risk level or severity factor of a patient in various emergencies. Modern technical advances in WBAN have revolutionized this area, enabling autonomous monitoring of vital signals over long durations and from remote places. However, handling heterogeneous packets in a fast-changing healthcare scenario remains an open opportunity for exploration. We present a novel concept of Dynamic Priority based Packet Handling (DPPH) which promises to add exciting capabilities to the world of WBANs. DPPH uses the principles of accurate identification and classification of heterogeneous packets to effectively determine a patient's critical condition and alert the medical server. In this paper, we focus on dynamic prioritization based queuing, scheduling, resource allocation and alerting policies for performance enhancement. The proposed approach is validated through a comparison with an existing approach: the protocol is implemented in the NS-2.35 network simulator and judged on the basis of packet delivery ratio, loss ratio, end-to-end delay and throughput, with variation in the number of nodes.
    Keywords: Alert; Abnormal condition; Weighted Deviation; Detection; Prioritization; Packet Handling; Vital signal; Wireless Body Area Network.
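
    The dynamic prioritization idea can be sketched with a small priority queue in which a packet's priority grows with the deviation of its vital-sign reading from a normal range; the ranges and weighting below are illustrative assumptions, not the DPPH policy itself:

        import heapq

        # Hypothetical normal ranges for vital signs.
        NORMAL = {"heart_rate": (60, 100), "spo2": (95, 100), "temp": (36.1, 37.2)}

        def deviation(signal, value):
            """Relative deviation of a reading outside its normal range (0 if normal)."""
            lo, hi = NORMAL[signal]
            if value < lo:
                return (lo - value) / lo
            if value > hi:
                return (value - hi) / hi
            return 0.0

        class DynamicPriorityQueue:
            """Packets with larger deviations (more critical patients) dequeue first."""
            def __init__(self):
                self._heap, self._seq = [], 0

            def push(self, signal, value, payload):
                # Negative deviation gives max-heap behaviour; seq breaks ties FIFO.
                heapq.heappush(self._heap, (-deviation(signal, value), self._seq, payload))
                self._seq += 1

            def pop(self):
                return heapq.heappop(self._heap)[2]

        q = DynamicPriorityQueue()
        q.push("heart_rate", 150, "pkt-A")   # tachycardia: high priority
        q.push("temp", 36.8, "pkt-B")        # normal: low priority
        q.push("spo2", 88, "pkt-C")          # hypoxia: high priority
        print(q.pop(), q.pop(), q.pop())     # critical packets dequeue first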

  • Development of Software Effort Estimation Using a Non-Fuzzy Model   Order a copy of this article
    by H. Parthasarath Patra, Kumar Rajnish 
    Abstract: Nowadays, accurate estimation of software effort is a challenging issue for software developers, since binding a contract depends purely on the estimated cost of the software. Overestimation or underestimation leads to a loss or gain on the software project and also changes its probability of success or failure. In this paper, we use a non-fuzzy conditional algorithm to build a suitable model for estimating software effort using NASA software project data. We construct a set of conditional linear models over the domain of possible KLOC (kilo lines of code) values. The performance of the developed model has been analyzed using the NASA data set [1], and comparisons with the results of the COCOMO tuned-PSO, Halstead, Walston-Felix, Bailey-Basili and Doty models are provided.
    Keywords: Lines of code; Software cost estimation; MRE; MMRE; PRED.
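
    The MRE, MMRE and PRED metrics named in the keywords, together with a KLOC-conditional (piecewise linear) effort model, can be sketched as follows; the breakpoints and coefficients are made up for illustration and are not the paper's fitted values:

        # Illustrative KLOC-conditional effort model (person-months).
        def effort(kloc):
            if kloc <= 10:
                return 2.4 + 1.05 * kloc    # small projects
            elif kloc <= 50:
                return 5.0 + 0.95 * kloc    # medium projects
            return 12.0 + 0.85 * kloc       # large projects

        def mre(actual, predicted):
            """Magnitude of relative error for one project."""
            return abs(actual - predicted) / actual

        def mmre(actuals, predictions):
            """Mean MRE over all projects."""
            return sum(mre(a, p) for a, p in zip(actuals, predictions)) / len(actuals)

        def pred(actuals, predictions, level=0.25):
            """PRED(25): fraction of projects with MRE <= 25%."""
            hits = sum(mre(a, p) <= level for a, p in zip(actuals, predictions))
            return hits / len(actuals)

        actuals = [11.0, 40.0, 70.0]                   # hypothetical person-months
        estimates = [effort(k) for k in (8, 35, 65)]   # KLOC of the same projects
        print(f"MMRE={mmre(actuals, estimates):.3f}  PRED(25)={pred(actuals, estimates):.2f}")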

  • HiRSA: Computing Hit Ratio for SOA applications through Tcases   Order a copy of this article
    by Arpita Dutta, Haripriya Kunsothh, Sangharatna Godboley, Durga Prasad Mohapatra 
    Abstract: In this article, we propose a novel method for black-box test case generation for Business Process Execution Language (BPEL) processes, together with a method to compute the Hit Ratio percentage metric. We also compute the total time taken for test case generation and the speed of test case generation. We design the Service-Oriented Architecture (SOA) based applications using the OpenESB tool. Because the OpenESB-generated .xml code is incompatible with the Tcases input framework, we have developed a code converter to generate Tcases-compatible .xml code. We then compute the Hit Ratio percentage with the help of a Hit Ratio Calculator. We have experimented with twelve SOA-based applications and, on average, achieve a Hit Ratio of 62.78% with an average generation time of 873.08 ms and a speed of 32.41 (approximately 32) test cases per second.
    Keywords: Service-Oriented Architecture; SOA; Hit Ratio; Black-box testing; Tcases.

  • An Effective Topic-based Ranking Technique for Categorized Research Articles   Order a copy of this article
    by Rajendra Kumar Roul, Jajati Keshari Sahoo 
    Abstract: The number of research articles on the web is increasing very rapidly due to the large volume of research work happening every day. Maintaining and searching the required articles according to user requirements is the need of the hour. Classification and ranking are two important information retrieval techniques that can shed light in this direction. This paper proposes an effective ranking approach as a follow-up to our earlier classification work, in which a huge volume of articles was classified into respective categories using keywords extracted from the keyword section of each article. To rank the articles in each category, the proposed approach uses Latent Dirichlet Allocation, which transforms the text into topics, and then applies an inverted indexing technique on them. Five benchmark datasets are used for the experimental work. The results obtained indicate that the performance of the proposed ranking technique is promising.
    Keywords: Hierarchical Classification; Inverted Indexing; Latent Dirichlet Allocation; Ranking; Text Categorization.
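
    The topic-transformation step can be sketched with scikit-learn: fit LDA over a category's articles, represent each article by its topic mixture, and rank by topic-space similarity to a query. The toy corpus and the choice of cosine similarity are illustrative assumptions, not the paper's exact pipeline:

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation
        from sklearn.metrics.pairwise import cosine_similarity

        # Toy corpus standing in for one category of research articles.
        docs = [
            "support vector machine kernel classification margin",
            "neural network deep learning backpropagation training",
            "topic model latent dirichlet allocation inference corpus",
            "svm kernel trick classification hyperplane margin",
        ]

        vec = CountVectorizer()
        X = vec.fit_transform(docs)

        # Transform the text into topics.
        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        doc_topics = lda.fit_transform(X)      # per-document topic mixtures

        query = vec.transform(["kernel classification with svm"])
        q_topics = lda.transform(query)

        # Rank articles by similarity to the query in topic space.
        scores = cosine_similarity(q_topics, doc_topics)[0]
        for score, idx in sorted(zip(scores, range(len(docs))), reverse=True):
            print(f"doc {idx}: {score:.3f}")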

  • Exception Discovery Using Ant Colony Optimization   Order a copy of this article
    by Saroj Ratnoo, Amarnath Pathak, Jyoti Ahuja, Jyoti Vashishtha 
    Abstract: Ant Colony Optimization (ACO) algorithms have been used to discover accurate and comprehensible classification rules. Discovering exceptions using ACO, however, is an underexplored area of research. Most classification algorithms focus on discovering rules with high generality; since exceptions have low support, they often get ignored as noise. This paper proposes an ACO-based algorithm to discover classification rules in an If-Then-Unless framework, where the Unless part contains exceptions. We have conducted experiments on ten datasets from the UCI machine learning repository. The suggested algorithm is found to be competitive with two well-known ACO-based classification algorithms (Ant-Miner and cAnt-MinerPB) with respect to predictive accuracy and comprehensibility. The algorithm has been able to capture a number of exceptions across several datasets. The classification rules discovered with exceptions are accurate, semantically comprehensible and interesting, and provide an opportunity to amend one's decision in exceptional circumstances.
    Keywords: Ant Colony Optimization; Classification; Exception Discovery; Rule Mining.

  • Reduction of Computation time in Differential Evolution based Quantization table optimization for the JPEG baseline algorithm   Order a copy of this article
    by Vinoth Kumar B, Karpagam GR 
    Abstract: The design of the quantization table is viewed as an optimization problem because the quantization table produces the compression/quality trade-off in the baseline Joint Photographic Experts Group algorithm. In this paper, efforts have been taken to reduce the computation time of the Differential Evolution (DE) algorithm by using a surrogate model. The paper applies a Problem Approximation Surrogate Model (PASM) to assist DE algorithms in optimizing the quantization table, and analyzes the performance of PASM in DE algorithms in terms of approximation error and from an evolutionary perspective. It also confirms the results using statistical hypothesis tests. PASM is integrated into the Classical Differential Evolution and Knowledge based Differential Evolution algorithms. Different benchmark images are used to validate PASM performance in DE algorithms for three target bits-per-pixel values. The results show that integrating PASM into DE algorithms reduces the computation time while yielding results similar to those of DE algorithms without a model.
    Keywords: Differential Evolution; Knowledge based Differential Evolution; Surrogate Model; Fitness Approximation; Problem Approximation; Image Compression; JPEG; Quantization table; Optimization; Meta-Heuristic search; ANOVA and Wilcoxon Signed rank test.

  • A Graph Based Approach for Feature selection from Higher Order Correlations   Order a copy of this article
    by Sunanda Das, Asit Kumar Das 
    Abstract: Graph technology has emerged as an important topic in the data mining and machine learning community. In the analysis of high-dimensional data, it is crucial to identify a smaller subset of features that are informative for classification and clustering. In this paper, an efficient graph-based feature selection method is proposed to render the analysis of high-dimensional data tractable. Feature scores are calculated to obtain the weights of the edges in a weighted graph, from which the optimal feature subset is identified. One advantage of this method is that it can successfully identify the optimal features for machine learning. Experimental results on our dataset verify the effectiveness and efficiency of the proposed method.
    Keywords: Graph technology; Score; Correlation; Feature Selection.

Special Issue on: Biomedical Signal and Imaging Trends and Artificial Intelligence Developments

  • Comparative studies of Discrete Cosine transform (DCT) and Lifting Wavelet Transform (LWT) techniques for compression of Blood Pressure Signal in Salt Sensitive Dahl Rat   Order a copy of this article
    by Vibha Aggarwal, Manjeet Singh Patterh, Virinder Kumar Singla 
    Abstract: This paper presents a study of quality-controlled compression methods based on the Discrete Cosine Transform (DCT) and the Lifting Wavelet Transform (LWT) for blood pressure signals of the salt-sensitive Dahl rat. The transformed coefficients are thresholded using the bisection algorithm to match a predefined, user-specified percentage root mean square difference (PRD) within a tolerance. A binary lookup table is then built to store the position map of zero and non-zero coefficients (NZC). The NZC are quantized by a Max-Lloyd quantizer followed by arithmetic coding, while the lookup table is encoded by Huffman coding. Results are presented for different blood pressure signals of varying characteristics. For both transforms, there is no significant difference between the before-quantization PRD (BPRD) and the after-quantization PRD (QPRD) across the various signals, and the mean compression ratio increases with an increase in the user-defined PRD (UPRD).
    Keywords: Blood Pressure signal in Salt Sensitive Dahl Rat; Compression; Nonlinear transform; Linear transform.
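
    The quality-control loop described above (bisection on a coefficient threshold until the reconstruction meets a user-defined PRD) can be sketched as follows for the DCT case; the synthetic waveform and tolerance are illustrative assumptions:

        import numpy as np
        from scipy.fft import dct, idct

        def prd(x, xr):
            """Percentage root mean square difference between signal and reconstruction."""
            return 100.0 * np.sqrt(np.sum((x - xr) ** 2) / np.sum(x ** 2))

        def threshold_to_prd(x, uprd, tol=0.01, max_iter=60):
            """Bisect the DCT threshold so the reconstruction PRD matches
            the user-defined PRD (UPRD) within tolerance."""
            c = dct(x, norm='ortho')
            lo, hi = 0.0, np.max(np.abs(c))
            for _ in range(max_iter):
                thr = 0.5 * (lo + hi)
                ct = np.where(np.abs(c) >= thr, c, 0.0)   # zero the small coefficients
                p = prd(x, idct(ct, norm='ortho'))
                if abs(p - uprd) < tol:
                    break
                if p < uprd:
                    lo = thr   # distortion budget left: discard more coefficients
                else:
                    hi = thr   # too much distortion: keep more coefficients
            return ct, p

        # Synthetic pulsatile waveform standing in for a blood pressure signal.
        t = np.linspace(0, 10, 2000)
        x = 100 + 20 * np.sin(2 * np.pi * 5 * t) + 5 * np.sin(2 * np.pi * 12 * t)
        ct, achieved = threshold_to_prd(x, uprd=2.0)
        print(f"PRD achieved: {achieved:.3f}%, nonzero coefficients: {np.count_nonzero(ct)}")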

  • B-mode breast ultrasound image segmentation techniques: an investigation and comparative analysis   Order a copy of this article
    by Madan Lal, Lakhwinder Kaur, Savita Gupta 
    Abstract: Breast cancer is the second leading cause of death among women. A commonly used method for the detection of breast cancer is ultrasound imaging. Ultrasonic imaging is a low-cost, easy-to-use, non-invasive and portable process, but it suffers from acoustic interference (speckle noise) and other artifacts. As a result, it becomes difficult for experts to directly identify the exact shapes of abnormalities in these images. Numerous techniques have been proposed by different researchers for visual enhancement and for segmentation of lesion regions in breast ultrasound images. In this work, different automatic and semi-automatic breast ultrasound image segmentation techniques are reviewed, with a brief explanation of their technological aspects. The performance of selected methods has been evaluated on a database of 45 B-mode breast ultrasound images containing benign and malignant tumors (25 benign and 20 malignant). For performance analysis of the segmentation methods, images manually delineated by an expert radiologist are used as ground truth, while boundary and area error metrics are used for comparison of the quantitative results.
    Keywords: B-Mode Breast Ultrasound (BUS) Image; Speckle Noise; Thresholding; Region Growing; Fuzzy Clustering; Watershed; Active Contour; Level Set.

  • An improved unsupervised mapping technique using AMSOM for neurodegenerative disease detection   Order a copy of this article
    by Isha Suwalka, Navneet Agrawal 
    Abstract: The most challenging aspect of medical imaging is the accuracy of detection of neurodegenerative diseases. Even with the advent of new imaging techniques, manual evaluation, manual reorientation and other time-consuming steps, along with reduced resolution, remain limitations. There is therefore a need to develop an efficient algorithm for proper detection that provides quantitative information of significance to clinicians. The proposed algorithm includes an improved adaptive moving self-organizing map (AMSOM) which trains the extracted features along with the Mini-Mental State Examination (MMSE) factor and a volumetric parameter obtained using the volume-based method (VBM) to compute the feature dataset, which overall improves the iteration rate, mean square error, sensitivity and accuracy. The algorithm is an improved version of the moving mapping method which, on the one hand, tackles the fixed-grid-mapping drawback of SOM and, on the other, improves the neighbourhood function of the neurons, providing better detection and classification and yielding promising results. It further improves the performance of AMSOM through better visualization of the input dataset and provides a framework for determining the optimal number and structure of neurons. This paper uses a real MRI dataset taken from OASIS, a cross-sectional collection of 416 subjects aged 18 to 96. The analysis includes a comparison of different mapping approaches that reveals features associated with Alzheimer's disease.
    Keywords: self organizing mapping for MRI image; hierarchical mapping with GHSOM; e-database using OASIS; moving neuron concept using AMSOM; clustering for detection of Alzheimer Disease.

  • Active Contours using Global Models for Medical Image Segmentation   Order a copy of this article
    by Ramgopal Kashyap, Vivek Tiwari 
    Abstract: Accurate segmentation with denoising is a subject of research in the fields of medical imaging and computer vision. This paper presents an enhanced energy-based active contour model with a level set formulation. The local energy fitting term exerts a neighborhood force to pull the contour and confine it to object boundaries, while the global intensity fitting term drives the movement of the contour far from the object boundaries. The global energy term is based on a global segmentation formulation, which can capture the energy information of an image better than the Chan-Vese (CV) model. The local and global terms are jointly combined to build an energy functional, based on a level set formulation, to segment images with intensity inhomogeneity. Experiments demonstrate that the proposed model has the advantage of noise resistance and is superior to conventional image segmentation. Results demonstrate that the proposed method performs better both subjectively and quantitatively compared with other state-of-the-art methods.
    Keywords: Denoising; Energy based active contour; Image segmentation; Intensity inhomogeneity; Local binary fitting; Local region based active contour.

  • Application of Ensemble Artificial Neural Network for the Classification of White Blood Cells using Microscopic Blood Images   Order a copy of this article
    by Jyoti Rawat, Annapurna Singh, Harvendra Singh Bhadauria, Jitendra Virmani, Jagrtar Singh Devgun 
    Abstract: Introduction: This work exhibits an application of an ensemble artificial neural network for white blood cell classification. The incentive for experimenting with artificial neural network (ANN) based computer aided classification (CAC) designs is that designs based on ensemble methods are expected to yield a better outcome than CAC system designs based on a single multiclass classifier. In recent times, digital image processing techniques have been widely utilized as part of health diagnosis. In order to overcome the problems of manual diagnosis in recognizing the morphology of blood cells, automated analysis is frequently utilized by pathologists. In the pathology lab, white blood cells are analyzed by an expert, which is a tedious and subjective task. With the goal of enhancing precision, an automatic white blood cell classification framework is crucial for helping pathologists diagnose various haematological disorders such as leukemia or lymphoma. This work gives a semi-automated technique to identify and classify white blood cells. Method: A k-means clustering algorithm is used to segment the nucleus by enhancing the region of the white blood cell nucleus and suppressing the other components of the blood smear images. From each cell image, different features, such as shape, chromatic and texture features, are extracted. This feature set is used to train the classifier to determine the different classes of white cells. Results: With this evaluation of classification models, we establish that the CAC system design based on the ensemble artificial neural network is the most suitable model for four-class white cell classification, with an accuracy of 95%. Conclusions: The proposed method analyzes blood cells automatically via image processing techniques, and represents a means to avoid the plentiful drawbacks associated with labour-intensive examination of white cells.
    Keywords: White blood cell; Segmentation; k-means clustering; Texture features; Shape features; Chromatic features; Artificial neural network classifier.

  • A Computerized Framework for prediction of fatty and Dense Breast Tissue Using Principal Component Analysis and Multi-resolution Texture Descriptors   Order a copy of this article
    by Indrajeet Kumar, Harvendra Singh Bhadauria, Jitendra Virmani 
    Abstract: The present work proposes a computerized framework for the prediction of fatty and dense breast tissue using principal component analysis and multi-resolution texture descriptors. For this study, 480 MLO-view digitized screen-film mammograms were taken from the DDSM dataset. A fixed ROI size of 128
    Keywords: Mammography; Breast density classification; Multi-resolution texture descriptors; principal component analysis; Support vector machine (SVM) classifier.

  • GPU-based Focus-Driven Multi-coordinates Viewing System for Large Volume Data Visualization   Order a copy of this article
    by Piyush Kumar, Anupam Agrawal 
    Abstract: Biomedical scanning modalities such as Computed Tomography (CT), Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI) are improving in resolution day by day. Newly trained physicians may face problems when relying on exploring 2D slices while diagnosing from the full 3D human anatomy at the same time. In this paper, we present a generalized, contactless, interactive, Graphics Processing Unit (GPU) accelerated, Compute Unified Device Architecture (CUDA) based focus-and-context visualization approach for displaying the inner anatomy of the large-scale Visible Human male dataset in a Multi-Coordinate Viewing System (MCVS). Focusing is achieved with a 3D Cartesian Region of Interest (ROI). The large dataset is structured using the octree method. Volume rendering is performed using an improved ray-cube intersection method for voxels with the ray casting algorithm. The results allow doctors to diagnose and analyze the atlas of 8-bit CT-scan data using three-dimensional visualization at efficient frame rates across operations such as zooming, rotating and dragging. The system is tested on multiple types of 3D medical datasets ranging from 10 MB to 3.15 GB, enabling medical practitioners and physicians to peer inside the dataset and examine its inner structures. It is further tested with three NVIDIA CUDA enabled GPU cards for performance analysis. The scope of this system is exploration of the human body for surgical purposes.
    Keywords: Volume Visualization; Focus-driven; MCVS; Focus and Context; MRI dataset; Medical dataset;.

  • Volumetric Tumor Detection Using Improved Region Grow Algorithm   Order a copy of this article
    by Shitala Prasad, Shikha Gupta 
    Abstract: This paper addresses the segmentation of brain pathological tissues (tumor, edema and necrotic core) and visualizes them in 3D for better physiological understanding. We propose a novel approach which combines thresholding and region growing for tumor detection. In the proposed system, the FLAIR and T2 modalities of MRI are used due to their unique ability to detect high- and low-contrast lesions with great accuracy. First, the tumor is segmented from an image that combines the FLAIR and T2 images, using a threshold value selected automatically based on the intensity variance of tumor and normal tissues in the 3D MR images. The tumor part is then extracted from the actual 3D MRI of the brain by selecting the largest connected volume, using 26-connected neighbors for correct detection. The method is evaluated using the publicly available BRATS dataset of 80 different patients with glioma tumors. The detection accuracy reaches 97.5%, which compares favorably with the state of the art in the given time frame. The algorithm takes 4-5 minutes to generate the 3D visualization for the final output.
    Keywords: 3D Volumetric; Brain Tumor; Region Growing Algorithm; Thresholding; Voxel Seeding.
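
    The extraction step (threshold, then keep the largest 26-connected volume) can be sketched with SciPy; the toy volume and threshold below are illustrative assumptions:

        import numpy as np
        from scipy import ndimage

        def largest_connected_volume(volume, threshold):
            """Threshold a 3D volume, label regions with 26-connectivity,
            and return a mask of the largest connected component."""
            binary = volume > threshold
            structure = np.ones((3, 3, 3))     # 26-connected neighborhood
            labels, n = ndimage.label(binary, structure=structure)
            if n == 0:
                return np.zeros_like(binary)
            sizes = ndimage.sum(binary, labels, index=range(1, n + 1))
            return labels == (np.argmax(sizes) + 1)

        # Toy volume: a large bright blob (the 'tumor') plus a small distractor.
        vol = np.zeros((40, 40, 40))
        vol[10:20, 10:20, 10:20] = 1.0
        vol[30:32, 30:32, 30:32] = 1.0
        mask = largest_connected_volume(vol, threshold=0.5)
        print("tumor voxels:", int(mask.sum()))   # 1000 = 10**3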

  • Multimodality Medical Image Fusion using Nonsubsampled Rotated Wavelet Transform for Cancer Treatment   Order a copy of this article
    by Satishkumar Chavan, Abhijeet Pawar, Sanjay Talbar 
    Abstract: This paper presents a nonsubsampled rotated wavelet transform (NSRWT) based feature extraction approach to multimodality medical image fusion (MMIF). Nonsubsampled rotated wavelet filters are designed to extract textural and edge features. These filters are applied to axial brain images of two modalities, namely Computed Tomography (CT) and Magnetic Resonance Imaging (MRI), to extract spectral features. The extracted features are selected using an entropy-based fusion rule to form a new composite spectral feature plane; the entropy-based rule preserves dominant spectral features and imparts all relevant information from both modalities to the fused image. The inverse nonsubsampled rotated wavelet transform is applied to reconstruct the fused image from the composite spectral slice. The proposed algorithm is evaluated subjectively and objectively for efficient fusion using 39 pilot image slices of 23 patients. Three expert radiologists have verified the subjective quality of the fused images to ascertain anatomical structures from the source images; their scores reveal that the fused image from the proposed algorithm is superior in terms of visualization of abnormalities compared with other wavelet based techniques. The objective evaluation involves estimation of fusion parameters such as the image quality index (IQI), the edge quality measure (EQa,b) and the mean structural similarity index measure (mSSIM). The proposed algorithm shows better performance metrics than state-of-the-art wavelet based algorithms.
    Keywords: Multimodality Medical Image Fusion; Discrete Wavelet Transform; Rotated Wavelet Filters; Nonsubsampled Rotated Wavelet Transform; Cancer Treatment; Radiotherapy.

  • Comparison of feature extraction techniques for classification of hardwood species   Order a copy of this article
    by Arvind R. Yadav, R.S. Anand, M.L. Dewal, Sangeeta Gupta, Jayendra Kumar 
    Abstract: The texture of an image plays an important role in the identification and classification of images. A hardwood species image contains four key elements, namely vessels (popularly known as pores in cross-section view), fibers, parenchyma and rays, useful in its identification and classification. The arrangements of all these elements possess texture-rich features. Thus, this work investigates existing texture feature extraction techniques for the classification of hardwood species. The texture features are extracted from grayscale images of the hardwood species to reduce computational complexity. Linear support vector machine (SVM), radial basis function (RBF) kernel SVM, Random Forest (RF) and linear discriminant analysis (LDA) classifiers are employed to investigate the efficacy of the texture feature extraction techniques, and the classification accuracies of the existing texture descriptors are compared. Further, principal component analysis (PCA) and the minimal-redundancy-maximal-relevance (mRMR) feature selection method are employed to select the best subset of the feature vector data. The PCA-reduced feature vector data of the co-occurrence of adjacent local binary patterns (CoALBP24) texture feature extraction technique attained a maximum classification accuracy of 96.33
    Keywords: Texture features; support vector machine; feature selection; hardwood species.

  • Myoelectric Control of Upper Limb Prostheses using Linear Discriminant Analysis and Multilayer Perceptron Neural Network with Back Propagation Algorithm   Order a copy of this article
    by Sachin Negi, Yatindra Kumar, V.M. Mishra 
    Abstract: Electromyogram (EMG) signals, or myoelectric signals (MES), have two prominent application areas in the field of biomedical instrumentation. EMG signals are primarily used to analyze neuromuscular diseases such as myopathy and neuropathy. In addition, the EMG signal can be utilized in myoelectric control systems, where external devices such as upper limb prostheses, intelligent wheelchairs and assistive robots are controlled by acquiring surface EMG signals. The aim of the present work is first to obtain the classification accuracy of a linear discriminant analysis (LDA) classifier, with principal component analysis (PCA) and uncorrelated linear discriminant analysis (ULDA) used as feature reduction techniques, for the upper limb prosthesis control application. Next, a multilayer perceptron (MLP) neural network with the back propagation algorithm is used to calculate the classification accuracy for upper limb prosthesis control.
    Keywords: EMG; MCS; LDA; PCA; ULDA; MLP; Back propagation.

  • Comparative Study of LVQ and BPN ECG Classifier   Order a copy of this article
    by Ashish Nainwal, Yatindra Kumar, Bhola Jha 
    Abstract: The ECG is the electrical waveform of heart activity and contains much information on heart disease. It is very important to diagnose heart disease as soon as possible; otherwise it can be harmful to the patient. This paper classifies ECG signals using learning vector quantization and a back propagation neural network, with morphological and frequency-domain features of the ECG. The 45 ECG signals from the MIT-BIH arrhythmia database are classified into two classes, normal and abnormal, using the above-mentioned classifiers. Of the 45 signals, 25 are normal and 20 are abnormal according to MIT-BIH. 28 morphological features and 4 frequency-domain features are given as input to the classifiers. The performance of the classifiers is measured in terms of Sensitivity (Se), Positive Predictivity (PP) and Specificity (SP). The system achieves 82.35% accuracy using LVQ and 94.11% using BPN.
    Keywords: Back Propagation Neural Network; Learning Vector Quantization; ECG; MIT-BIH.

  • Automatic feature extraction of ECG signal based on adaptive window dependent Differential histogram approach and validation with CSE database   Order a copy of this article
    by Sucharita Mitra, Madhuchhanda Mitra, Basudev Halder 
    Abstract: A very simple and novel idea based on an adaptive-window-dependent differential histogram approach is proposed for automatic detection and identification of ECG waves and their characteristic features. To facilitate estimation of the waves, the normalized signal is divided into a number of small windows by an adaptive window selection technique. By counting the number of changes between successive samples as a frequency, the differential histogram is plotted. Zones having an area greater than a pre-defined threshold are marked as QRS zones, and the local maxima of these zones are referred to as the R-peaks. T and P peaks are also detected. The baseline point and clinically significant time-plane features are computed and validated against reference values of the CSE database. The proposed technique achieves better performance in comparison with the CSE groups, with Sensitivity of 99.86%, Positive Predictivity of 99.76% and Detection accuracy of 99.8%.
    Keywords: Adaptive window; Differential histogram; CSE database; baseline; Sensitivity; ECG signal; QRS zones; R-peaks; Distinctive points; Sample values.
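
    Under one reading of the abstract, the core counting idea can be sketched as follows: measure sample-to-sample change per window, mark windows whose differential-histogram area exceeds a threshold as QRS zones, and take each zone's local maximum as the R-peak. The activity measure, window size and threshold factor here are assumptions for illustration, not the paper's exact procedure:

        import numpy as np

        def differential_histogram(ecg, win):
            """Per-window activity: sum of absolute sample-to-sample differences
            (one reading of the 'frequency of changes' in the abstract)."""
            ecg = (ecg - ecg.min()) / (ecg.max() - ecg.min())   # normalize
            n = len(ecg) // win
            hist = np.array([np.sum(np.abs(np.diff(ecg[i*win:(i+1)*win])))
                             for i in range(n)])
            return ecg, hist

        def detect_r_peaks(ecg, win=40, factor=2.0):
            """Windows whose histogram area exceeds factor * mean are QRS zones;
            the local maximum of each zone is taken as the R-peak."""
            ecg_n, hist = differential_histogram(ecg, win)
            thr = factor * hist.mean()
            return [i*win + int(np.argmax(ecg_n[i*win:(i+1)*win]))
                    for i in range(len(hist)) if hist[i] > thr]

        # Toy signal: flat baseline with sharp spikes standing in for QRS complexes.
        sig = np.zeros(1000)
        sig[100], sig[400], sig[700] = 1.0, 0.9, 1.1
        print(detect_r_peaks(sig))   # [100, 400, 700]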

  • A Comparative Study on Kapur's and Tsallis Entropy for Multilevel Thresholding of MR Images via Particle Swarm Optimization Technique   Order a copy of this article
    by Taranjit Kaur, Barjinder Singh Saini, Savita Gupta 
    Abstract: The present paper explores both the Kapur's and Tsallis entropies for three-level thresholding of brain MR images. The optimal thresholds are obtained by maximizing these entropies using a population-based search technique called particle swarm optimization (PSO). The algorithm is implemented for the segregation of various tissue constituents, i.e., the cerebral spinal fluid (CSF), white matter (WM) and gray matter (GM) regions, from simulated images obtained from the BrainWeb database. The efficacy of the thresholding methods is evaluated by a measure of spatial overlap, the Dice coefficient (Dice). The experimental results show that: (1) for both WM and CSF, the Tsallis entropy outperforms Kapur's entropy, achieving average Dice values of 0.967279 and 0.878031, respectively; and (2) for GM, Kapur's entropy is more beneficial, as justified by its mean Dice value of 0.851025 in this case.
    Keywords: Kapur’s; Tsallis; multilevel thresholding; PSO.
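
    The Dice overlap used for evaluation is straightforward to compute; a minimal sketch on hypothetical binary masks:

        import numpy as np

        def dice(seg, gt):
            """Dice coefficient 2|A ∩ B| / (|A| + |B|) for binary masks."""
            seg, gt = seg.astype(bool), gt.astype(bool)
            inter = np.logical_and(seg, gt).sum()
            return 2.0 * inter / (seg.sum() + gt.sum())

        # Hypothetical white-matter mask from thresholding vs. ground truth.
        seg = np.zeros((8, 8)); seg[2:6, 2:6] = 1
        gt = np.zeros((8, 8)); gt[3:7, 2:6] = 1
        print(f"Dice = {dice(seg, gt):.3f}")   # 0.750: 12 shared of 16 + 16 pixels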

Special Issue on: Advances in Computational Systems

  • Analysis of THD in Various Power Electronic Converters to Regulate the Voltage from Renewable Energy Sources   Order a copy of this article
    by Jency Joseph, Josh F. T., Ronaldo Lamare, Blessen Varghese Mathew 
    Abstract: This work presents a performance analysis of various power electronic converters with an RL load to reduce total harmonic distortion. The power converters inspected are the zeta converter, the single-ended primary inductance converter (SEPIC) and the Z-source converter. The objective is to analyze which power electronic converter exhibits less total harmonic distortion (THD) and greater efficiency, in order to select a suitable converter for an electric vehicle propulsion system. The three above-mentioned converters are designed, modelled and simulated. The zeta converter proves advantageous over the other converters inspected, with reduced total harmonic distortion and higher output power, and hence improved efficiency. The simulations are done in MATLAB/SIMULINK and the results are presented.
    Keywords: Total Harmonic Distortion (THD); MATLAB simulation; Z-source converter (ZSC); Zeta converter; SEPIC converter.
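
    The THD figure being compared is the ratio of harmonic content to the fundamental; a minimal sketch estimating it from a waveform's spectrum, using a synthetic signal with known harmonics rather than a simulated converter output:

        import numpy as np

        def thd(signal, fs, f0, n_harmonics=10):
            """THD = sqrt(sum of squared harmonic amplitudes) / fundamental
            amplitude, estimated from the magnitude spectrum."""
            spec = np.abs(np.fft.rfft(signal)) / len(signal)
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

            def amp(f):   # magnitude at the bin nearest frequency f
                return spec[np.argmin(np.abs(freqs - f))]

            harm = np.sqrt(sum(amp(k * f0) ** 2 for k in range(2, n_harmonics + 2)))
            return harm / amp(f0)

        # 50 Hz fundamental with 10% third and 5% fifth harmonics.
        fs, f0 = 10000, 50
        t = np.arange(0, 1, 1 / fs)
        v = (np.sin(2*np.pi*f0*t) + 0.10*np.sin(2*np.pi*3*f0*t)
             + 0.05*np.sin(2*np.pi*5*f0*t))
        print(f"THD = {100 * thd(v, fs, f0):.2f}%")   # ~11.18% = sqrt(0.10^2 + 0.05^2)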

  • SVM Classification of Brain images from MRI Scans using Morphological Transformation and GLCM Texture Features   Order a copy of this article
    by Usha Ramasamy, Perumal K 
    Abstract: This paper introduces a novel HTT-based GLCM texture feature extraction procedure for automatic MRI (magnetic resonance imaging) brain image classification. The method has three phases: (1) a Hierarchical Transformation Technique (HTT), (2) texture feature extraction and (3) classification. The newly proposed HTT method incorporates optimum disk-shaped mask selection, top-hat and bottom-hat morphological operations, and some mathematical operations for both image pre-processing and enhancement. The gray level co-occurrence matrix (GLCM) is computed to extract statistical texture features such as contrast, correlation, energy, entropy and homogeneity from an image. These extracted co-occurrence matrix features are then fed into an SVM (support vector machine) to classify MRI brain images as normal or abnormal. The HTT-based GLCM approach is also compared with the conventional GLCM texture feature extraction method.
    Keywords: magnetic resonance images; classification; texture feature extraction; grey level co-occurrence matrix; support vector machine; top hat transform; bottom hat transform.
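
    The standard versions of the two building blocks named above (top-hat/bottom-hat enhancement, then GLCM features) can be sketched with scikit-image; the mask radius and the random stand-in image are assumptions, and the authors' exact HTT pipeline is not reproduced:

        import numpy as np
        from skimage.morphology import disk, white_tophat, black_tophat
        from skimage.feature import graycomatrix, graycoprops

        def enhance(img, radius=5):
            """Classic morphological contrast enhancement:
            enhanced = img + top-hat - bottom-hat."""
            se = disk(radius)
            out = (img.astype(int) + white_tophat(img, se).astype(int)
                   - black_tophat(img, se).astype(int))
            return np.clip(out, 0, 255).astype(np.uint8)

        def glcm_features(img):
            """Contrast, correlation, energy and homogeneity from the GLCM,
            plus entropy computed from the normalized matrix."""
            glcm = graycomatrix(img, distances=[1], angles=[0],
                                levels=256, symmetric=True, normed=True)
            feats = {p: float(graycoprops(glcm, p)[0, 0])
                     for p in ('contrast', 'correlation', 'energy', 'homogeneity')}
            p = glcm[:, :, 0, 0]
            feats['entropy'] = float(-np.sum(p[p > 0] * np.log2(p[p > 0])))
            return feats

        rng = np.random.default_rng(0)
        mri_slice = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
        print(glcm_features(enhance(mri_slice)))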

  • Performance Analysis of Lyapunov Stability Based and ANFIS Based MRAC   Order a copy of this article
    by Kalpesh Pathak, Dipak Adhyaru 
    Abstract: This paper analyzes two parameter adjustment laws for a model reference adaptive controller (MRAC): the Lyapunov stability rule and an adaptive neuro-fuzzy inference system (ANFIS) used to adjust the controller parameters. The nature of the systems, the adaptive controller, the basic block diagram and the control law are discussed. For in-depth analysis, two case studies are considered: simulations of two benchmark process control applications, level control in a coupled tank and concentration control in a biochemical reactor (BCR). The considered systems exhibit mutual parameter interaction and nonlinear parameter dynamics. The introduction covers the literature survey, the development of the topic and the importance of the work. Initially, the Lyapunov rule based technique is applied for control in both cases; with the ANFIS-based algorithm, new values of the adjustment parameters are generated. Comparative results are plotted and discussed for each proposed algorithm, and show that the ANFIS-based MRAC gives improved results in the presence of system uncertainties.
    Keywords: Coupled Tank; Biochemical Reactor; Model Reference Adaptive Control; Lyapunov Stability; ANFIS.
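
    For reference, the Lyapunov-rule side of such a comparison can be illustrated with the textbook first-order MRAC (not the paper's coupled-tank or BCR models; all parameters below are made up):

        import numpy as np

        # Plant:      dy/dt  = -a*y  + b*u   (a, b unknown to the controller)
        # Reference:  dym/dt = -am*ym + bm*r
        # Control:    u = th1*r - th2*y
        # Lyapunov rule: dth1/dt = -gamma*e*r, dth2/dt = gamma*e*y, e = y - ym
        a, b = 1.0, 0.5          # true (unknown) plant parameters
        am, bm = 2.0, 2.0        # reference model
        gamma, dt, T = 5.0, 1e-3, 20.0

        y = ym = th1 = th2 = 0.0
        for k in range(int(T / dt)):
            t = k * dt
            r = 1.0 if (t % 10) < 5 else -1.0      # square-wave reference
            e = y - ym
            u = th1 * r - th2 * y
            # Euler integration of plant, reference model and adaptation laws.
            y += dt * (-a * y + b * u)
            ym += dt * (-am * ym + bm * r)
            th1 += dt * (-gamma * e * r)
            th2 += dt * (gamma * e * y)

        print(f"tracking error {abs(y - ym):.4f}, th1={th1:.2f} (ideal {bm/b:.1f}), "
              f"th2={th2:.2f} (ideal {(am-a)/b:.1f})")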

  • An Efficient Load Balancing Mechanism with Deadline Consideration on GridSim   Order a copy of this article
    by Deepak Kumar Patel, Chita Ranjan Tripathy 
    Abstract: GridSim is a very popular Grid simulation tool. The GridSim toolkit is used to simulate application schedulers for different parallel and distributed computing systems such as clusters and Grids. Many researchers have proposed various load-balancing techniques for the Grid, but these cannot be used directly in GridSim due to its structural differences. In this paper, we propose an enhanced load balancing method for GridSim called Enhanced GridSim with Load Balancing based on Deadline Consideration (EGLBD). The proposed algorithm balances the load by providing an effective selection method for efficient scheduling of Gridlets among heterogeneous resources, which maximizes resource utilization and increases the efficiency of the Grid system. The proposed algorithm also details the load estimation method for every level of GridSim. We simulate the proposed algorithm on the GridSim platform; on comparison, the proposed mechanism is found to outperform the existing schemes in terms of finished Gridlets, unfinished Gridlets, total execution time and resubmission time. The simulation results are presented.
    Keywords: GridSim; Gridlet; Gridresource; Load Balancing.

  • Pipel: Exploiting Resource Reorganization to Optimize Performance of Pipeline-Structured Applications in the Cloud   Order a copy of this article
    by Vinicius Meyer, Rodrigo Da Rosa Righi, Vinicius Facco Rodrigues, Cristiano André Da Costa, Guilherme Galante, Cristiano Both 
    Abstract: Workflow has become a standard for many scientific applications that are characterized by a collection of processing elements and arbitrary communication among them. In particular, a pipeline application is a type of workflow that receives a set of tasks, which must pass through all processing elements (also called stages here) in a linear fashion, where the output of a stage becomes the input of the next one. To compute each stage, it is possible to use a single compute node or to distribute its incoming tasks among the nodes of a cluster. However, the strategy of using a fixed number of resources can cause under- or over-provisioning situations, besides not fitting irregular demands; moreover, selecting the number of resources and their configurations is not trivial, being strongly dependent on the application and the tasks to be processed. In this context, our idea is to deploy the pipeline application in the cloud, executing it with a feature that differentiates the cloud from other distributed systems: resource elasticity. Thus, we propose Pipel: a reactive elasticity model that uses lower and upper load thresholds and the CPU metric to select, on the fly, the most appropriate number of compute nodes and virtual machines (VMs) for each stage along the pipeline execution. This article presents the Pipel architecture, highlighting load balancing and scaling-in and scaling-out operations at each stage, as well as the elasticity equations and rules. Based on Pipel, we developed a prototype which was evaluated with a three-stage graphical application and four different task workloads (increasing, decreasing, constant and oscillating). The results were promising, presenting an average gain of 38% in application time when comparing non-elastic and elastic executions.
    Keywords: Cloud elasticity; Pipeline applications; Performance Optimization; Dynamic Resource Management; Adaptivity.
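
    The reactive threshold rule at the heart of such a model can be sketched as follows; the threshold values and the single-metric decision are illustrative assumptions, not Pipel's tuned equations:

        def scaling_decision(cpu_loads, lower=0.3, upper=0.7):
            """Per-stage reactive elasticity: average CPU load above the
            upper threshold adds a VM; below the lower threshold releases
            one (never dropping below a single VM)."""
            avg = sum(cpu_loads) / len(cpu_loads)
            if avg > upper:
                return "scale-out"   # allocate one more VM for this stage
            if avg < lower and len(cpu_loads) > 1:
                return "scale-in"    # release one VM from this stage
            return "steady"

        # One monitoring cycle over a three-stage pipeline (per-VM CPU fractions).
        stages = {"stage-1": [0.95, 0.88], "stage-2": [0.45], "stage-3": [0.10, 0.15]}
        for name, loads in stages.items():
            print(name, "->", scaling_decision(loads))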

  • An Efficient Payload Distribution Method for High Capacity Image Steganography   Order a copy of this article
    by Sandeep Rathor, Anand Singh Jalal, Soumendu Chakraborty 
    Abstract: To produce a higher level of security, most irreversible and reversible image steganography techniques rely on encrypting the secret image (payload) before embedding it into the cover image. In this case, the steganographic system requires more computation time if the payload is large. Therefore, a technique with lower computation time and higher embedding capacity that introduces the same level of distortion as a state-of-the-art encryption technique can enhance the performance of stego systems. In this paper, we propose a payload distribution method for secure secret image sharing that has lower computation time; the proposed scheme can be considered a better option than an encryption technique. In the proposed scheme, the payload is distributed over a sign map (SM), an error factor (EF) and a normalized error (NE) using a primary cover image. The payload, a grayscale image, is divided into the sign map, error factor and normalized error and then embedded into an RGB secondary cover image using a high-capacity steganography algorithm. The proposed method is an alternative to encryption which reduces the overall complexity of the stego system. Result analysis shows that the distortion introduced into the resulting sign map, error factor and normalized error is significant enough to conceal the original payload, with lower computation time than state-of-the-art encryption schemes.
    Keywords: Irreversible Steganography; Payload; Cover Image; LSB; Sign Map; Error Factor.
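
    The embedding stage itself (the LSB technique named in the keywords) can be sketched minimally; the SM/EF/NE payload split described in the abstract is not reproduced here:

        import numpy as np

        def embed_lsb(cover, bits):
            """Write payload bits into the least significant bit of each
            cover pixel in row-major order."""
            flat = cover.flatten()
            assert len(bits) <= len(flat), "payload too large for cover"
            flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
            return flat.reshape(cover.shape)

        def extract_lsb(stego, n_bits):
            """Read the first n_bits payload bits back out."""
            return stego.flatten()[:n_bits] & 1

        rng = np.random.default_rng(1)
        cover = rng.integers(0, 256, (8, 8), dtype=np.uint8)   # toy cover image
        payload = rng.integers(0, 2, 40, dtype=np.uint8)       # 40 payload bits
        stego = embed_lsb(cover, payload)
        assert np.array_equal(extract_lsb(stego, 40), payload)
        # LSB embedding changes each used pixel by at most 1 gray level.
        print("max distortion:", int(np.max(np.abs(stego.astype(int) - cover.astype(int)))))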

  • Armor on Digital Images Captured Using Photoelectric Technique by Absolute Watermarking Approach   Order a copy of this article
    by Suresh Annamalai 
    Abstract: Nowadays, digital images captured using the photoelectric technique undergo malicious modifications. This paper describes high-quality recovery of digital documents using an absolute watermarking approach. Lost information is identified using bit values; however, the problem of recovering scaled, rotated and translated images still remains. The absolute watermarking approach assures the recovery of digital images from any form of manipulation, providing high-quality pictures. The different sorts of bits used for this purpose are classified into audit bits, carrier bits and output bits. The consistent features of the original image are coded and the output bits are protected using a carrier encoder. This enables the audit bits to detect erasure locations and retrieve the manipulated areas of the image as high-quality pictures at low cost.
    Keywords: Carrier Encoding; Tamper Proofing; SIFT; Predictive Coding; Spoofing Detection; Compression.

Special Issue on: Soft Computing Approaches in Wireless Networks with IoT and Medical Health Care System (SI-SWNMS)

  • Reduced Mutual Coupling MIMO Antenna   Order a copy of this article
    by Hari Krishna, Maturi Thirupathi 
    Abstract: In this paper, a reduced mutual coupling 1x2 inset-feed rectangular patch antenna is presented. The antenna elements are separated by a distance of λ0/4, exhibiting excellent isolation of -55 dB in the 5 GHz band. To improve the isolation between the closely placed antennas, a compact planar meander-line-based Electromagnetic Bandgap (EBG) structure, which behaves like a double negative (DNG) material, is placed between them. The proposed EBG structure is implemented on the MIMO antenna with both continuous and discontinuous ground planes. It is found that the EBG structure with a discontinuous ground plane improves the isolation between antenna elements by at least 6 dB compared with a continuous ground plane. The proposed antenna structures are fabricated, showing good agreement between simulated and measured results.
    Keywords: EBG Structure; MIMO Antenna; Miniature Antenna; Wideband Antenna.