Forthcoming articles


International Journal of Intelligent Systems Technologies and Applications (IJISTA)

These articles have been peer-reviewed and accepted for publication but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

Register for our alerting service, which notifies you by email when new issues are published online.

Open Access: Articles marked with this Open Access icon are freely available and openly accessible to all without any restriction except those stated in their respective CC licenses.
We also offer RSS feeds which provide timely updates of tables of contents, newly published articles and calls for papers.

International Journal of Intelligent Systems Technologies and Applications (25 papers in press)

Regular Issues

  • GEOSS: An Intelligent Methodology for Identifying Site Suitability of Air Sample Collection   Order a copy of this article
    by Kamonasish Mistry, Biplab Biswas, Siwen Zhang, Tao Wu, Liang Zhou, Abdelfettah Benchrif, Srimanta Gupta 
    Abstract: The epistemology of air pollution (AP) is well established through numerous studies, and the literature shows that AP levels change with changes in land use and land cover (LULC) types. However, no common methodology or model has yet been developed for optimum sampling that correlates LULC types and their changes with AP levels and their changes. A pre-planned, well-calculated geospatial method is needed to evaluate the ambient AP level, its type and its variation over different LULC types. GEOSS (Geospatial Estimation of Optimum Sample Site) was developed to identify optimum AP sampling sites that represent wide spatial coverage over varied LULC types. Image processing using geospatial techniques and statistical tools was used to select the optimum sampling locations. The few studies that have collected AP samples in the field have mostly used random sampling, which does not always reflect all LULC types and most often yields clustered distributions; GEOSS attempts to overcome these major shortcomings of random sampling by emphasizing geospatial techniques in selecting optimum sample-collection sites. A validation based on nearest neighbour analysis confirms that the sampling points produced by GEOSS are distributed more systematically and satisfy all the basic assumptions of the sampling procedure.
    Keywords: Geospatial Modelling; Optimum location; Land use land cover; Kolkata Metropolitan Area; Air pollution level and change; Sampling techniques.
    DOI: 10.1504/IJISTA.2020.10031710
  • Study of the Fractal Nature of Evapotranspiration Time Series from Agricultural Regions of Northern Karnataka   Order a copy of this article
    by Uttam Patil, Nandini Sidnal 
    Abstract: Multifractal detrended fluctuation analysis (MFDFA) renders valuable insights into the randomness, inner regularity and long-range correlations in time series data. We applied this technique to assess the fractal behavior and inherent correlations of the potential and reference crop evapotranspiration data collected from two regions of Karnataka, viz., Belgaum and Raichur. The annual periodic component of the evapotranspiration series was removed using the seasonal trend decomposition method, and all the decomposed series were found to exhibit long-term persistence. The multifractal behavior of the evapotranspiration series at the two stations was evident from the strong dependency of the generalized Hurst exponent H(q) on the values of q. We also tested for the reduction in the dependence of H(q) on q by shuffling the evapotranspiration time series, which indicates that the multifractality arises from both the correlation characteristics and the probability density function of the evapotranspiration data series. The two regions, Belgaum and Raichur, vary considerably in climatic and geomorphic conditions, and this difference is evident in the fractal properties of their corresponding evapotranspiration data, as depicted by the values of the generalized Hurst exponent H(q) obtained using the MFDFA technique.
    Keywords: Agriculture; Evapotranspiration; MFDFA.
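
    The detrended fluctuation analysis underlying MFDFA can be illustrated by its q = 2 (monofractal) special case; the sketch below is illustrative only, and the function names are not from the paper:

    ```python
    import numpy as np

    def dfa_fluctuation(x, scales, order=1):
        """Detrended fluctuation F(s) of series x for each window size s."""
        profile = np.cumsum(x - np.mean(x))         # integrated (profile) series
        F = []
        for s in scales:
            n_seg = len(profile) // s
            sq = []
            for i in range(n_seg):
                seg = profile[i * s:(i + 1) * s]
                t = np.arange(s)
                coeffs = np.polyfit(t, seg, order)  # local polynomial trend
                sq.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
            F.append(np.sqrt(np.mean(sq)))
        return np.array(F)

    def hurst_exponent(x, scales):
        """Slope of log F(s) vs log s: about 0.5 for uncorrelated noise,
        above 0.5 for long-term persistence (as reported in the abstract)."""
        F = dfa_fluctuation(x, scales)
        return np.polyfit(np.log(scales), np.log(F), 1)[0]
    ```

    MFDFA generalizes this by replacing the mean-square fluctuation with a q-th order moment, which yields the generalized Hurst exponent H(q) discussed in the abstract.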

  • A survey on hand gesture recognition for Mobile Devices   Order a copy of this article
    by Houssem Lahiani, Mahmoud Neji 
    Abstract: As smartphones and mobile devices become ubiquitous in daily life, facilitating interaction with them will have a positive impact on their use. This paper surveys an emerging and important research area in human-machine interaction: hand gesture recognition for mobile devices. Hand gesture recognition is already a fairly mature field, and its integration into advanced technologies such as smartphones makes a survey an interesting and needed contribution. Because the design of interactive mobile applications should differ from that of traditional desktop applications, the paper discusses concepts that help in understanding the new challenges and introduces techniques for properly exploiting these devices. Researchers in human-machine interaction aim to develop systems able to recognize in-air hand gestures for controlling smartphones and portable devices; gestures have long been seen as an interaction method that can provide more natural, intuitive and efficient ways to communicate with computers. Efficient and robust hand gesture recognition techniques for mobile devices are still under development. This paper discusses the main steps in designing such systems, namely detection, tracking and recognition; reviews the current literature, including comparative studies; classifies existing gesture recognition systems for mobile devices under various key parameters; and gives an overview of recent work on both vision-based and contact-based systems, together with the techniques and approaches used. We conclude with some reflections.
    Keywords: Mobile computing; android; gesture recognition; Human-machine interaction.

  • Denoising 1D Signal Using Wavelets   Order a copy of this article
    Abstract: Signal denoising is one of the most important areas in signal processing. In the present paper, we denoise a 1D piecewise constant (PWC) signal corrupted by additive white Gaussian noise (AWGN) using the thresholded Haar wavelet denoising method. The central idea of the wavelet transform is the multiresolution decomposition of signals. This is advantageous because small objects require high resolution, while low resolution is suitable for large objects. In multiresolution decomposition, an approximation component is created using a scaling function, often termed a lowpass filter, and detail components are obtained using wavelet functions, often known as highpass filters. A series of approximations of a signal is thus obtained, which differ in resolution by a factor of 2; the detail components contain the difference between adjacent approximations. Proper thresholding of the transformed signal reduces noise because, in general, the noise component of a signal has a relatively smaller magnitude and wider bandwidth. We show that our method outperforms the recently reported convex 1D total variation denoising method with non-convex regularization on PWC signals.
    Keywords: Wavelets; Thresholding; Denoising; Multiresolution analysis.
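
    The thresholded Haar wavelet scheme described above can be sketched in a few lines (a minimal illustration with hard thresholding; the signal length is assumed to be a power of two, and the function names are not from the paper):

    ```python
    import numpy as np

    def haar_forward(x, levels):
        """Multi-level Haar DWT: returns final approximation and detail lists."""
        details = []
        a = np.asarray(x, dtype=float)
        for _ in range(levels):
            d = (a[0::2] - a[1::2]) / np.sqrt(2)   # highpass: detail component
            a = (a[0::2] + a[1::2]) / np.sqrt(2)   # lowpass: approximation
            details.append(d)
        return a, details

    def haar_inverse(a, details):
        """Reconstruct the signal from approximation and detail coefficients."""
        for d in reversed(details):
            out = np.empty(2 * len(a))
            out[0::2] = (a + d) / np.sqrt(2)
            out[1::2] = (a - d) / np.sqrt(2)
            a = out
        return a

    def denoise_hard(x, levels, thresh):
        """Zero the small detail coefficients (assumed noise), keep the rest."""
        a, details = haar_forward(x, levels)
        details = [np.where(np.abs(d) > thresh, d, 0.0) for d in details]
        return haar_inverse(a, details)
    ```

    On a PWC signal plus AWGN, a threshold of a few noise standard deviations removes most noise coefficients while the step edges survive in the few large detail coefficients.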

  • Packet Loss Concealment Based Estimation of Polynomial Interpolation for Improving Speech Quality in VoIP   Order a copy of this article
    by Adil Bakri, Abderrahmane Amrouche 
    Abstract: The main objective of packet loss concealment (PLC) techniques is to improve speech quality in Voice over IP (VoIP). These techniques generate a synthetic speech signal to cover the missing data, or lost packets, in a received bit-stream. This paper presents a new PLC technique using the estimation of polynomial interpolation (EPI) method, which seeks an approximating polynomial function that can be used to predict a lost packet from previous packets. A two-state Markov model, in particular, is used to represent the lost packets. The test vectors used in the proposed PLC evaluation are drawn from the TIMIT database. The proposed PLC algorithm, which provides better speech quality, is evaluated by PESQ and SNR and compared with two existing techniques: a hidden Markov model (HMM)-based PLC algorithm and a deep neural network (DNN)-based PLC.
    Keywords: Packet loss concealment; VoIP; Speech quality; Packet loss model.
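
    The two-state Markov packet-loss model mentioned in the abstract can be sketched as a minimal simulation (the function name and parameter names below are illustrative, not from the paper):

    ```python
    import random

    def gilbert_loss_trace(n, p_gb, p_bg, seed=0):
        """Two-state Markov (Gilbert) loss model.
        p_gb: P(good -> bad) transition; p_bg: P(bad -> good) transition.
        A packet sent while in the 'bad' state is lost.
        Returns a list of booleans (True = packet lost)."""
        rng = random.Random(seed)
        bad = False
        trace = []
        for _ in range(n):
            trace.append(bad)              # record loss/delivery for this packet
            if bad:
                if rng.random() < p_bg:
                    bad = False
            else:
                if rng.random() < p_gb:
                    bad = True
        return trace
    ```

    In the long run the loss rate converges to p_gb / (p_gb + p_bg), e.g. 0.05 / (0.05 + 0.45) = 10%, while losses arrive in bursts whose mean length is 1 / p_bg.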

  • Palm vein recognition system based on multi-block statistical features encoding by phase response information of Nonsubsampled contourlet transform   Order a copy of this article
    by Amira Oueslatihermi 
    Abstract: In this paper, we improve our palm vein recognition system by basing it on the phase response information of the nonsubsampled contourlet transform (NSCT). First, we localize the region of interest (ROI); next, we divide the ROI into non-overlapping blocks and propose an encoding method based on extracting phase response information from the NSCT coefficients; the XOR pattern is then applied to extract invariant features from local regions of the palm vein and create a 512-byte palm vein template. Finally, we calculate a modified Hamming distance between templates to estimate the similarity between two filtered palm vein images. The method is tested on the CASIA Multispectral Palmprint database. The experimental results illustrate the effectiveness of this coding in two biometric palm vein modes: a 99.90% rank-one recognition rate and a 0.19% equal error rate in verification.
    Keywords: Palm-vein; recognition; ROI Extraction; Nonsubsampled contourlet transform; Feature Extraction; phase response information; XOR pattern; Statistical Descriptor; multi-blocks.
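
    The template comparison stage can be illustrated as follows. The paper's exact XOR-pattern operator is not specified here, so the centre-pixel variant below is an assumption; the Hamming distance shown is the plain normalized form rather than the paper's modified one:

    ```python
    import numpy as np

    def xor_pattern(block):
        """XOR each bit of a binary block with its centre bit (one common form
        of 'XOR pattern'; an assumption, the paper's operator may differ)."""
        centre = block[block.shape[0] // 2, block.shape[1] // 2]
        return np.bitwise_xor(block, centre)

    def hamming_distance(t1, t2):
        """Normalized Hamming distance between two binary templates:
        0.0 for identical templates, 1.0 for complementary ones."""
        t1, t2 = np.asarray(t1), np.asarray(t2)
        return np.count_nonzero(t1 != t2) / t1.size
    ```

    Verification then reduces to thresholding this distance: two palm vein templates are accepted as the same identity when the distance falls below the operating point that yields the reported equal error rate.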

    by Nitish, Amit Kumar Singh 
    Abstract: Over the last two decades, the automatic detection of brain tumors in MR images has been an emerging area of research in medical science. A brain tumor is an abnormal mass of tissue in which cells grow and multiply uncontrollably, seemingly unchecked by the mechanisms that control normal cells. Brain tumors are the second leading cause of cancer-related deaths in men and the fifth leading cause among women. Diagnosing the tumor at an early stage is a very important part of its treatment. MRI and CT scans are the most significant techniques for detecting brain tumors; despite such promising modalities, the characterization of abnormalities remains a challenging and difficult task. This paper presents an automated computer-aided brain tumor detection system that first classifies MRI images as tumorous or non-tumorous and then calculates the area of the affected region: a hysteresis thresholding technique segments the lesion region from the rest of the image, Gabor filters perform feature extraction, and an SVM classifier carries out the classification. The performance of the suggested system is evaluated in terms of sensitivity, specificity, classification accuracy and area under the curve (AUC) using ground truth images. The proposed system gives an average accuracy of 97%.
    Keywords: MRI; Tumor; Hysteresis thresholding; Wavelet transform; Gabor filters; Dimensionality reduction; classification; Confusion matrix; SVM; Sensitivity; Specificity; ROC; AUC.

  • Refurbishing ANN with the Aid of Adaptive Crow Search Optimization for Effectively Diagnosing Railway Wheel Condition   Order a copy of this article
    by Kota Venkateswarlu, Venkatachalapathy V.S.K., Velmurugan K., Thiagarajan A. 
    Abstract: This research diagnoses railway wheel condition with the aid of an artificial neural network (ANN). In diagnosis tasks, the ANN has proven its convenience over manual computation in various industrial and transportation applications. The research utilizes optimization techniques to identify appropriate hidden layers and their associated neurons in order to enhance the performance of the ANN. This configuration process employs optimization techniques including the evolutionary algorithm (EA), genetic algorithm (GA), particle swarm optimization (PSO) and crow search optimization (CSO). The research also modifies and improves the conventional CSO strategy, incorporating a novel strategy called adaptive crow search optimization (ACSO) to enhance the performance of the ANN further. The proposed novel strategy achieves a proficient performance of 99.2% accuracy, which is 1.7% greater than the conventional ANN model and on average 0.9% greater than the other competing optimization techniques used to configure the ANN model. The credibility of the ANN model increases when these optimization techniques are employed in diagnosing railway wheel condition.
    Keywords: Artificial Neural Network (ANN); Evolutionary Algorithm (EA); Genetic Algorithm (GA); Particle Swarm Optimization (PSO) and Adaptive Crow Search Optimization (ACSO).

  • An Ant Colony Optimization based Framework for the Detection of Suspicious Content and Profile from Text Corpus   Order a copy of this article
    by Asha Kumari, Balkishan  
    Abstract: Technical advancements in the field of short message communication have swiftly raised the menace of suspiciousness in human communication. This uncontrolled growth of unsolicited suspicious content has motivated researchers to work on controlling this peril. Suspicious content can be any uninvited message that can lead to rumors, fake news, spam, or malicious and threatening activities. Although there are numerous classical content-based methods for the detection of suspicious activities, these methods fall short for short messages such as SMS and microblogs. This research work addresses the problem of identifying suspicious content and profiles in SMS and Twitter microblogs. A framework based on ant colony optimization is presented for the detection of suspicious content and profiles (ACODSCP). In this study, one Twitter microblog text dataset and two SMS text corpora are utilized. The global optimization behavior of the ant colony optimization (ACO) concept is incorporated to determine suspicious activities and profiles effectively with a minimal feature set. The performance of the proposed ACODSCP system is assessed in terms of precision, recall and F-measure, and the evaluation shows promising results in comparison with existing concepts.
    Keywords: Suspicious Content; Swarm Intelligence; Ant Colony Optimization; Short Message Service (SMS); Twitter Microblogs; Spam; Social Communication Means.

  • A dynamic configuration with a shared knowledge center for multi objective ant colony optimization algorithms   Order a copy of this article
    by Mohamed RHAZZAF, Tawfik MASROUR 
    Abstract: This paper proposes a dynamic approach to ant colony optimization algorithm configuration, applied to multi-objective optimization problems. Indeed, the inertia of a static view of the pheromone and visibility preference values makes a dynamic approach desirable. We propose a model based on a collective knowledge center shared by the colony members, which stores the best configurations drawn from previous colonies' experiments during a learning phase on random problems. The construction of this center is based on statistical and qualitative studies of the evaluation criteria, which are explained throughout the paper. Our model's results show a rise in output quality, as well as a proof of concept for the artificial learning approach.
    Keywords: Swarm Intelligence; Multi Agents System; Multi-criteria Optimization; Ant Colony Optimization; Traveling salesman problems.

  • An ECC Based Authentication Protocol for Fog-IoT Enabled Smart Home Environment   Order a copy of this article
    by Bhabendu Kumar Mohanta, Debasish Jena, Srikanta Patnaik 
    Abstract: An Internet of Things (IoT)-enabled smart home means all the home appliances are connected to the internet. All these devices are monitored and controlled using a mobile device such as an iPhone or iPad, regardless of location or time constraints. In an IoT-based smart home network, associated users and devices need to be recognized, since otherwise unauthorized access reduces the security and privacy of the smart environment. The contributions of this paper are as follows: first, an elliptic curve cryptography based authentication protocol is proposed for fog nodes and users in the IoT-based smart home environment; second, a simulation-based security analysis of the proposed authentication protocol is carried out using the AVISPA tools; third, BAN logic is used to verify that the devices in the smart home environment achieve the security goals. Finally, a comparison with some existing works shows that the proposed protocol achieves better security goals in a smart home environment.
    Keywords: Authentication; IoT; Smart Home Environment; Security; Elliptic curve cryptography.

  • Model to enhance Security posture of IoT devices and components with Private APN   Order a copy of this article
    by Akashdeep Bhardwaj, Bharat Bhushan 
    Abstract: The internet is a dangerous place for the countless devices connected to it. These devices, or "things" on the internet, connect from remote locations at the network edge and provide various services that make our lives easy. With little or no security protection, IoT vendors use low-grade virtual private network solutions for devices linked over public, insecure circuits, specifically the internet. This enables the protection of sensitive data in transit as logs traverse from sensors and edge devices to business systems and cloud infrastructure. However, the rise of new vulnerabilities and attack methodologies, more often than not, ends up compromising these devices. The reality is that such devices face an unprecedented risk of exposure to botnet, malware, man-in-the-middle, remote execution and session hijacking cyber-attacks. The impact not only disrupts the functioning of the IoT deployment, disabling services and turning devices into bots; it also damages business reputation and brand value. The authors strongly believe that industrial and commercial business operations and end-user devices should never be at risk of exposure to cyber-attacks. This paper presents a secure IoT architecture using a private Access Point Name (APN) model. The authors discuss how the proposed model provides superior security, visibility and regulation over data traffic flows originating from the IoT device to the cloud infrastructure. This architecture has been successfully tested and implemented for a utility company in India for real-time monitoring of its critical infrastructure elements.
    Keywords: IoT; IoT Security; Private APN; Mobile; Private IP.

  • Deep Learning Techniques for Classification of Brain MRI   Order a copy of this article
    by Imayanmosha Wahlang, Pallabi Sharma, Sugata Sanyal, Goutam Saha, Arnab Kumar Maji 
    Abstract: Several brain diseases are becoming a threat to people's livelihoods. One such problem is the presence of a brain tumor. A brain tumor can be benign or malignant; it is dangerous if it is malignant or a secondary tumor (metastasis). Therefore, there is a need to detect the presence of tumors at the earliest possible stage. An automated method for brain tumor detection can complement medical expertise, since a biopsy can be avoided if early detection is possible. Classification helps in predicting both the type of image and the type of tumor. This paper involves three stages. In the first stage, the classification of brain MR images into normal (non-tumor) or abnormal (tumor) images using ConvNet, LeNet, ResNet and DenseNet is analyzed. In the second stage, architectures such as LeNet and AlexNet are used to predict the tumor type, namely metastasis, glioma or meningioma. Lastly, classification into high-grade glioma and low-grade glioma is performed using U-Net and AlexNet.
    Keywords: Convolutional Neural Network (CNN); DenseNet; ResNet; AlexNet; U-Net.

  • An Innovative AAL system based on Neural Networks and IoT-aware technologies to improve the Elderly People Life Quality   Order a copy of this article
    by Benito Taccardi, Piercosimo Rametta, Pierluigi Carcagnì, Marco Leo, Cosimo Distante, Luigi Patrono 
    Abstract: The growing average age of the population and continuous changes in family habits increasingly create the condition of elderly people living alone. In addition, as the rate of cognitive diseases of all degrees increases with age, some kind of support for elders in daily activities, such as home maintenance and shopping at the supermarket, sooner or later becomes necessary. From this perspective, providing new smart services could improve elderly people's quality of life, and it could also indirectly benefit the formal and informal caregivers who are today delegated to support the elderly's daily routine. Ambient assisted living systems play a very important role in addressing these social issues, which also have a significant economic impact on families and governments. In particular, the combination of information and communication technologies enabling the Internet of Things and artificial intelligence paradigms is a very promising research topic for effectively addressing the daily challenges faced by people affected by neurodegenerative diseases. This work presents a smart system able to support elderly people in managing goods purchases and filling in a shopping list. It was made possible using low-cost technologies such as smart boards, mobile applications and a few sensors, combined with a powerful neural network aimed at recognizing products. The system is unobtrusive and intuitive. The proposed system, whose main services were developed in the cloud, has been validated both functionally, through a proof of concept, and quantitatively, by a performance analysis of its components.
    Keywords: Ambient Assisted Living; Internet of Things; Neural Network.

Special Issue on: Recent Advancements in Autonomous Devices for Real-World Applications

  • Improving Network Lifetime and Speed for 6LoWPAN Networks Using Machine Learning   Order a copy of this article
    by Shubhangi Kharche, Sanjay Pawar 
    Abstract: Wireless communication networks have an inherent optimization issue of effectively routing data between nodes. This issue is multi-objective in nature, and covers optimization of routing speed, the network lifetime, packet delivery ratio and overall network throughput. In this paper, a machine learning (ML) based algorithm is proposed for minimizing the network delay and increasing network lifetime for 6LoWPAN networks based on RPL routing. The ML based approach is compared with normal RPL routing in order to check the performance of the system when compared to recent routing protocols. It is observed that the proposed machine learning based approach reduces the network delay by more than 20% and improves the network lifetime by more than 25% when compared to RPL based 6LoWPAN networks. The machine learning approach also takes into account the link quality between the nodes, thereby improving the overall QoS of the communication system by selecting paths with minimal delay, minimal energy consumption and maximum link quality.
    Keywords: Machine learning; 6LoWPAN; RPL; Feedback mechanism; Artificial Intelligence.
    DOI: 10.1504/IJISTA.2020.10027027
    by D. Haripriya, S. Ramyasree 
    Abstract: In airports, there is a possibility of individuals entering with fake tickets in order to engage in criminal activities, while mislaid baggage and flight delays may lead to dissatisfaction among passengers. This paper focuses on fake ticket identification, the avoidance of mislaid baggage and flight delay notification, as existing systems do not address all of these issues together. An automated system is designed to improve customer facilities in terms of flight delay notification, mislaid baggage alerts and ticket booking. Baggage is handled using RFID tags, while fake tickets are identified through QR scanning. Flight delays are predicted using a machine learning based linear regression technique with 97% accuracy. A mobile app is developed for ticket booking, improving passenger facilitation to a great extent, and a web app is developed for airport management to verify fake tickets. The proposed airport management system makes air travel more customer-friendly, with high security.
    Keywords: Linear regression technique; RFID; QR code; Machine learning; Airport management.

  • Design of BTI Sensor Based Improved SRAM for Mobile Computing Applications   Order a copy of this article
    by Kumar Neeraj, J.K. Das, Hari Shanker Srivastava 
    Abstract: The reliability of electronic components is a major concern as CMOS technology is scaled down, especially in mobile computing applications such as MPEG video processor design. Scaling CMOS technology leads to an exponential increase in power density per unit area. Bias temperature instability (BTI) is one of the serious problems in SRAM cell design at low technology nodes. In this paper, a detection technique is proposed that detects the BTI effect on SRAM using a static noise margin (SNM) calculation. The proposed prototype is used to detect faults during the read and write cycles of aged SRAM, which affect the reliability of the circuit. Fault diagnosis is performed by detecting the BTI effect on the SRAM using the SNM calculation. The circuit design in CMOS technology is carried out using the HSPICE simulator in Cadence.
    Keywords: Static RAM cell; CMOS technology; Bias Temperature Instability (BTI); Technology Scaling; Static Noise Margin (SNM).

  • Hybrid Genetic Algorithm in Partial Transmit Sequence to Improve OFDM   Order a copy of this article
    by Ravikumar Polukonda 
    Abstract: This work considers the use of the partial transmit sequence (PTS) technique for reducing the peak-to-average power ratio (PAPR) of the orthogonal frequency division multiplexing (OFDM) signal in wireless communication systems. The conventional PTS scheme uses an extensive random search to explore combinations of phase vectors to improve PAPR, but this elevates the complexity of the search and exponentially increases the number of phase vectors, demanding high computational cost and compromising accuracy. In this work, a suboptimal algorithm is used for phase optimization, based on an enhanced version of the genetic algorithm (GA), which is applied to explore optimal combinations of phase vectors; its performance is compared with currently active algorithms such as the particle swarm optimization (PSO) algorithm and the bacterial foraging optimization (BFO) algorithm. This hybrid GA enhances the accuracy and convergence rate of the conventional algorithms, with only a few parameters requiring adjustment. Simulation results show that the hybrid GA-PSO and GA-BFO based PTS algorithms attain a reasonable reduction in PAPR by employing a simple network structure, compared with other conventional algorithms.
    Keywords: Orthogonal Frequency Division Multiplexing (OFDM); Partial Transmit Sequence (PTS); Peak to Average Power Ratio (PAPR); Genetic Algorithm (GA); Particle Swarm Optimization (PSO); Bacterial Foraging Optimization (BFO).
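
    The exhaustive-search PTS baseline whose cost the GA is meant to avoid can be sketched as follows (illustrative parameters and partitioning; not the paper's implementation):

    ```python
    import numpy as np
    from itertools import product

    def papr_db(x):
        """Peak-to-average power ratio of a time-domain signal, in dB."""
        p = np.abs(x) ** 2
        return 10 * np.log10(p.max() / p.mean())

    def pts_exhaustive(X, n_sub=4, phases=(1, -1)):
        """PTS with an exhaustive phase search over each subblock.
        X: frequency-domain OFDM symbol, partitioned into contiguous subblocks."""
        N = len(X)
        parts = []
        for i in range(n_sub):
            Xi = np.zeros(N, dtype=complex)
            lo, hi = i * N // n_sub, (i + 1) * N // n_sub
            Xi[lo:hi] = X[lo:hi]
            parts.append(np.fft.ifft(Xi))       # per-subblock time-domain signal
        best_x, best_papr = None, np.inf
        for b in product(phases, repeat=n_sub):  # try every phase combination
            x = sum(bi * pi for bi, pi in zip(b, parts))
            pr = papr_db(x)
            if pr < best_papr:
                best_x, best_papr = x, pr
        return best_x, best_papr
    ```

    The search space grows as |phases|^n_sub, which is exactly the exponential cost that the GA, PSO and BFO variants discussed above try to sidestep by sampling only promising phase vectors.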

    by Sekhar Babu, P.V. Naganjaneyulu, K. Satya Prasad 
    Abstract: Modern communication systems face several quite challenging issues, particularly those involving smart antennas such as antenna array beams. The antennas help the array improve signal reception, thereby improving the signal-to-interference ratio (SIR). In coherent multiple input multiple output (MIMO) radar, optimal target signal processing can be achieved for any transmitted waveform or radiation beam pattern, allowing transmit beamforming through waveform design without degrading target detection performance. Several adaptive beamforming (ABF) techniques have been proposed to optimize the steering ability of the array with regard to the main lobe and nulls, improving the signal-to-interference-plus-noise ratio (SINR). In this work, an optimized neural network based on the bacterial foraging optimization algorithm (BFOA), along with a greedy algorithm and a tabu search (TS) algorithm for channel selection, with an adaptive beam formed using a steering vector, is proposed. In the proposed hybrid, adaptive seed dispersion makes BFOA-TS converge faster than BFOA-Greedy. This behaviour has been verified by applying BFOA-TS and BFOA-Greedy to well-known test functions. The experimental results show that the proposed method achieves better performance than the others.
    Keywords: Beam Forming; Smart Antenna; Multiple Input Multiple Output (MIMO) System; Channel Selection; Bacterial Foraging Optimization Algorithm (BFOA); Greedy algorithm and Tabu Search (TS).

  • Multiple data cost based stereo matching method to generate dense disparity maps from images under radiometric variations.   Order a copy of this article
    by Akhil Appu Shetty, V.I. George, C. Gurudas Nayak, Raviraj Shetty 
    Abstract: Stereo matching algorithms are capable of providing dense 3D information about the environment from two images taken simultaneously by a pair of cameras placed horizontally and parallel to each other. This 3D information is generated in the form of disparity maps. The depth of the objects in the images can be extracted from the disparity map through the relation (b*f/d), where (b) and (f) indicate the baseline and focal length of the cameras, and (d) indicates the disparity obtained through the stereo matching method. Obtaining an accurate disparity map from a stereo image pair is not only a challenging task but also computationally expensive, as similar pixels must be searched for in the reference and target images. If environmental effects such as differences in illumination and exposure conditions are also taken into consideration, the difficulty of the task increases drastically. In this research work, the authors attempt to overcome this problem by combining multiple stereo cost functions in the form of a linear equation. Moreover, to reduce the computation time, a segmentation based cost aggregation method is followed, in an attempt to produce an accurate disparity map even in the presence of radiometric variations in the images. The radiometric condition of the reference image (left image) is fixed, which sets its illumination and exposure condition, while the radiometric conditions (exposure and illumination) of the target image (right image) are varied, indicating a large variation in both illumination and exposure. The performance of the proposed algorithm is observed while varying the relationship parameter between the cost functions and the number of segments into which the images are broken. The image pairs with varying radiometric conditions used in this research work were obtained from the Middlebury stereo dataset.
    Keywords: Stereo matching; Middlebury stereo dataset; SLIC segmentation; disparity maps.
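
    The depth-from-disparity relation stated in the abstract can be sketched directly; the baseline, focal length and disparity values below are illustrative, not from the paper:

```python
# Depth recovery from a disparity map via depth = b*f/d.
import numpy as np

def disparity_to_depth(disparity, baseline_m, focal_px):
    """Convert a disparity map (pixels) to a depth map (metres).

    Zero disparities (no match) are mapped to infinite depth to
    avoid division by zero.
    """
    disparity = np.asarray(disparity, dtype=float)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    depth[valid] = baseline_m * focal_px / disparity[valid]
    return depth

# Example: baseline 0.1 m, focal length 700 px, disparity 35 px -> 2 m.
depth = disparity_to_depth([[35.0, 0.0]], baseline_m=0.1, focal_px=700.0)
```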

  • Neural Network Decoder for (7, 4) Hamming Code   Order a copy of this article
    by Aldrin Vaz, C. Gurudas Nayak, Dayananda Nayak 
    Abstract: Error Correcting Codes (ECC) are used to ensure the accuracy, integrity and fault tolerance of transmitted data, and various techniques have been developed to decode the received data and correct errors. In this paper, Artificial Neural Networks (ANN) are used in place of traditional error-correcting techniques because of their real-time operation, self-organization and adaptive learning, and their ability, by analogy with the human brain, to project the most likely outcome. A decoding approach based on the back-propagation algorithm for a feed-forward ANN has been simulated in MATLAB for the (7, 4) Hamming code. The designed ANN is trained on all possible combinations of code words so that it can detect and correct up to 1-bit errors. The synaptic weights are updated during each training cycle of the network. The simulation results show that the proposed technique correctly detects and corrects 1-bit errors in the received data.
    Keywords: Artificial Neural Network; Back Propagation Algorithm; Error Correcting Code; Hamming Code.
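
    For reference, the 1-bit correction behaviour that such an ANN decoder is trained to reproduce can be obtained classically by syndrome decoding; the parity-check matrix and systematic bit layout below are one common convention, not necessarily the paper's:

```python
# Classical syndrome decoding for a systematic (7,4) Hamming code.
import numpy as np

# Parity-check matrix: its columns are the 7 distinct nonzero 3-bit vectors.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def decode_hamming74(received):
    """Correct up to one flipped bit in a 7-bit received word."""
    r = np.array(received) % 2
    syndrome = H @ r % 2
    if syndrome.any():
        # The syndrome equals exactly one column of H: flip that bit.
        error_pos = int(np.argmax((H.T == syndrome).all(axis=1)))
        r[error_pos] ^= 1
    return r[:4]  # first four bits carry the message (systematic form)

# The all-zero codeword with bit 2 flipped decodes back to the zero message.
corrected = decode_hamming74([0, 0, 1, 0, 0, 0, 0])
```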

  • Implementation and Evaluation of a Trust Model with Data Integrity Based Scheduling in Cloud   Order a copy of this article
    by A.V.H. Sai Prasad, G.V.S. Raj Kumar 
    Abstract: Cloud computing is a model that provides services to users on demand, using infrastructure belonging to various systems on a cloud that can be accessed over the internet. By means of a simple, easy-to-understand Graphical User Interface (GUI) or Application Programming Interface (API), the cloud can hide the inherent complexity of the infrastructure and its fine details. Jobs in cloud computing are created based on priorities: the job with the highest priority is executed first, with each job assigned to a suitable resource until a valid or optimal schedule is reached. This introduces several challenges and risks from a security point of view and decreases the efficacy of conventional protective approaches. To address these challenges, data integrity plays a vital role, ensuring the accuracy and consistency of stored data and guarding against unauthorised modification. This work suggests trust-based Min-Min and Max-Min algorithms, since the various services are provided by several unknown parties or enterprises. Max-Min schedules the longest task first when an instance is being scheduled, while Min-Min gives precedence to the task with the shortest computation time. Conventional Min-Min and Max-Min are based on completion time and execution time only; hence a trust factor is added alongside completion and execution time to support security and integrity in the cloud. The specific security characteristics of the cloud environment are assured using a Trusted Third Party (TTP), and cryptography is used to provide confidentiality, integrity and authentication for the data and communication involved. The solution presents a horizontal level of service, available to all the entities involved, by means of a security mesh within which the required trust is maintained.
    Keywords: Cloud Computing; Scheduling; Security; Min-Min Algorithm; Max-Min Algorithm; Trust Model.
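
    The scheduling idea can be sketched as a minimal Min-Min loop with a trust weight folded into the cost. How trust is combined with completion time (here: dividing by a trust score in (0, 1]) is an illustrative assumption, not the paper's exact formulation:

```python
# Trust-weighted Min-Min scheduling sketch.
def trust_min_min(exec_time, trust):
    """exec_time[t][r]: execution time of task t on resource r.
    trust[r]: trust score of resource r in (0, 1].
    Returns a list of (task, resource) assignments in scheduling order."""
    n_tasks, n_res = len(exec_time), len(trust)
    ready = [0.0] * n_res              # current ready time of each resource
    unscheduled = set(range(n_tasks))
    schedule = []
    while unscheduled:
        # Among all (task, resource) pairs, pick the lowest trust-weighted
        # completion time -- the Min-Min rule with trust factored in.
        best = None
        for t in sorted(unscheduled):
            for r in range(n_res):
                cost = (ready[r] + exec_time[t][r]) / trust[r]
                if best is None or cost < best[0]:
                    best = (cost, t, r)
        _, t, r = best
        ready[r] += exec_time[t][r]
        unscheduled.remove(t)
        schedule.append((t, r))
    return schedule

# Two tasks, two resources; resource 1 is faster but only half as trusted.
plan = trust_min_min([[4.0, 2.0], [3.0, 5.0]], trust=[1.0, 0.5])
```

With these numbers, task 1 goes first on the fully trusted resource 0 (cost 3.0), after which task 0 is cheaper on resource 1 (cost 4.0 versus 7.0).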

Special Issue on: Recent Advancements in Artificial Intelligence Systems and Their Applications

  • An Improved Henry Gas Solubility Optimization-based Feature Selection Approach for Histological Image Taxonomy   Order a copy of this article
    by Susheela Vishnoi, Ajit Kumar Jain 
    Abstract: Classification of histopathological images is an important area of research in medical imaging. However, the complexities present in histopathological images make the classification process difficult. For such complex images, the selection of prominent features for classification is also a challenging task and remains an open research area for computer vision researchers. Therefore, an effective method for selecting prominent image features is introduced in this work. To this end, an improved Henry gas solubility optimization is proposed, in which a new position update equation is used to balance global and local search. The selected features are then input to classifiers to identify histopathological images. The performance of the improved Henry gas solubility optimization is analysed on 23 benchmark functions. The proposed feature selection method has been evaluated on two datasets, namely the breast cancer cell dataset and the ICIAR grand challenge dataset, and eliminates up to 60% of the features, on average, from both datasets. To validate the usefulness of the selected features, the results of different classifiers are compared. Experimental results show that the presented method outperforms other methods.
    Keywords: Feature selection; Henry gas solubility optimization algorithm; Histology images; Image classification.
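
    The wrapper setup the abstract describes, encoding a feature subset as a binary mask, scoring it, and searching for a better mask, can be sketched as below. A simple bit-flip hill climber and a toy correlation-based fitness stand in for the paper's improved HGSO and its classifiers, so everything here is illustrative:

```python
# Generic wrapper-style binary feature selection (illustrative stand-in
# for a metaheuristic such as HGSO).
import numpy as np

rng = np.random.default_rng(0)

def select_features(X, y, fitness, n_iters=200):
    """Greedy bit-flip search over binary feature masks."""
    n_features = X.shape[1]
    mask = rng.integers(0, 2, n_features).astype(bool)
    best_score = fitness(X[:, mask], y) if mask.any() else -np.inf
    for _ in range(n_iters):
        cand = mask.copy()
        cand[rng.integers(n_features)] ^= True   # flip one random bit
        if cand.any():
            score = fitness(X[:, cand], y)
            if score > best_score:
                mask, best_score = cand, score
    return mask

# Toy fitness: mean absolute correlation with the label, minus a small
# penalty per selected feature (so uninformative features get dropped).
def fitness(Xs, y):
    corr = [abs(np.corrcoef(Xs[:, j], y)[0, 1]) for j in range(Xs.shape[1])]
    return float(np.mean(corr)) - 0.01 * Xs.shape[1]

X = rng.normal(size=(100, 6))
y = X[:, 0] + 0.1 * rng.normal(size=100)   # only feature 0 is informative
mask = select_features(X, y, fitness)
```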

  • Clustering based hybrid resampling techniques for social lending data   Order a copy of this article
    by Pankaj Kumar Jadwal, Sonal Jain, Basant Agarwal 
    Abstract: Social lending is a popular and emerging loan disbursement process in which an individual can act as a borrower or a lender. Evaluating the credit risk of borrowers effectively is a crucial task, especially in social lending, where the chance of default is higher than in traditional models. Social lending datasets are imbalanced in nature because defaulters are far fewer than successful borrowers. Machine learning models built on such datasets are biased towards the class representing the majority of samples (the majority class), so the probability of accurately predicting minority class samples is decreased. In this paper, we propose a novel Clustering based Hybrid Sampling algorithm (CBHS), in which multi-phase K-means clustering is applied to the minority class samples to perform oversampling (KMBOS), and fuzzy c-means clustering is applied to the majority class samples to perform undersampling (FCBU). Experimental results show that the KMBOS and FCBU algorithms outperform state-of-the-art oversampling and undersampling techniques.
    Keywords: Credit risk; Clustering; Classification; Hybrid model; Oversampling; Undersampling; Class Imbalance.
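
    The cluster-then-resample idea can be sketched for the oversampling side (KMBOS): cluster the minority class with k-means, then interpolate between points inside the same cluster. The k-means implementation, cluster count and interpolation rule below are illustrative assumptions, not the paper's exact algorithm:

```python
# Cluster-based minority oversampling sketch.
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, n_iters=20):
    """Plain Lloyd's k-means; returns a cluster label per row of X."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def cluster_oversample(X_min, n_new, k=2):
    """Generate n_new synthetic minority samples inside k-means clusters."""
    labels = kmeans(X_min, k)
    nonempty = [j for j in range(k) if (labels == j).any()]
    synthetic = []
    for _ in range(n_new):
        members = X_min[labels == nonempty[rng.integers(len(nonempty))]]
        if len(members) < 2:               # singleton cluster: duplicate it
            synthetic.append(members[0])
            continue
        a, b = members[rng.choice(len(members), 2, replace=False)]
        synthetic.append(a + rng.random() * (b - a))  # point on segment a-b
    return np.vstack([X_min, np.array(synthetic)])

# Two tight minority clusters around (0,0) and (5,5); add 10 synthetics.
X_min = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(5, 0.1, (5, 2))])
X_bal = cluster_oversample(X_min, n_new=10)
```

Interpolating only within a cluster keeps synthetic points inside dense minority regions instead of on the line between the two separated blobs.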

    by Deepa Mathur, Deepak Bhatia, Prashant K. Jamwal, Shahid Hussain, Mergen H. Ghayesh 
    Abstract: This paper aims to develop an adaptive control strategy for a fuzzy logic system to be implemented in a robotic gait training orthosis. The robotic orthosis has a bio-inspired design that evolved from a careful study of the biomechanics of human gait. The ambulatory requirements of the robot are met by employing lightweight but powerful pneumatic muscle actuators (PMA), which produce the sagittal-plane rotations of the orthosis at the hip and knee. The PMAs are controlled by a fuzzy logic controller based on Mamdani inference in order to obtain the necessary rotational degrees of freedom. To cope with the nonlinear behavior of the PMAs under external disturbances, a second fuzzy controller has been developed. PMAs are also known for their time-dependent characteristics, so an adaptive control mechanism has been introduced to compensate for this. Experiments with healthy subjects were carried out to understand and estimate the performance of the adaptive fuzzy logic controller as well as the entire adaptive robotic design. The human-robot interaction was mainly kept passive-active, while the paths used for the robot were strictly predefined trajectories of the kind usually employed by physical therapists during rehabilitation sessions.
    Keywords: Adaptive fuzzy logic control; robotic orthosis; gait training; PMA; neurological impairments.
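
    A single-input Mamdani controller of the kind described can be sketched as follows; the membership functions, rule table and input/output universes are illustrative assumptions, not the paper's design:

```python
# Minimal Mamdani fuzzy controller: one input (tracking error), one output
# (normalised actuator command), with centroid defuzzification.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_control(error):
    """Map a tracking error in [-1, 1] to a command in [-1, 1]."""
    # Fuzzify: membership in Negative / Zero / Positive error sets.
    mu = {
        "N": tri(error, -2.0, -1.0, 0.0),
        "Z": tri(error, -1.0, 0.0, 1.0),
        "P": tri(error, 0.0, 1.0, 2.0),
    }
    # Rules (Mamdani min-implication): N error -> N command, Z -> Z, P -> P.
    u = np.linspace(-1.0, 1.0, 201)              # output universe
    clipped = {
        "N": np.minimum(mu["N"], tri(u, -2.0, -1.0, 0.0)),
        "Z": np.minimum(mu["Z"], tri(u, -1.0, 0.0, 1.0)),
        "P": np.minimum(mu["P"], tri(u, 0.0, 1.0, 2.0)),
    }
    agg = np.maximum.reduce(list(clipped.values()))  # max-aggregation
    if agg.sum() == 0:
        return 0.0
    return float((u * agg).sum() / agg.sum())        # centroid defuzzify

cmd_zero = fuzzy_control(0.0)   # symmetric memberships -> command near 0
cmd_pos = fuzzy_control(0.5)    # positive error -> positive command
```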