Forthcoming articles


International Journal of Intelligent Systems Technologies and Applications


These articles have been peer-reviewed and accepted for publication in IJISTA, but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.


Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.


Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.


Articles marked with this Open Access icon are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licenses.


Register for our alerting service, which notifies you by email when new issues of IJISTA are published online.


We also offer RSS feeds which provide timely updates of tables of contents, newly published articles and calls for papers.


International Journal of Intelligent Systems Technologies and Applications (45 papers in press)


Regular Issues


  • Support Vector Machine based Fault Detection and Diagnosis for HVAC Systems   Order a copy of this article
    by Jiaming Li 
    Abstract: Various faults occurring in Heating, Ventilation and Air-Conditioning (HVAC) systems inevitably lead to higher energy consumption and degraded thermal comfort. This paper presents a feasible and effective solution to the HVAC fault detection and diagnosis problem based on statistical machine learning. It learns the consistent signatures of different types of faults in HVAC operation using a Support Vector Machine (SVM), and then identifies fault types in all subsystems using the statistical relationships between groups of measurements. To speed up the learning process, Principal Component Analysis (PCA) is applied to compress the training data. Our approach models the dynamical subsystems and sequence data in the HVAC system, and the learnt models can then be used for automatic fault detection and diagnosis. The approach has been tested on commercial HVAC systems, where it successfully detected and identified a number of typical AHU faults.
    Keywords: Fault detection and diagnosis; FDD; Machine learning; SVM; HVAC system; Principal Component Analysis; PCA.
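The PCA compression step the abstract mentions can be sketched as below. This is a generic illustration of compressing correlated sensor readings before classification, not the authors' implementation; the data, component count and function names are made up for the example.

```python
import numpy as np

def pca_compress(X, n_components):
    """Project samples onto the top principal components.

    X: (n_samples, n_features) sensor readings.
    Returns the projected data and the fraction of variance retained.
    """
    Xc = X - X.mean(axis=0)                      # centre each feature
    cov = np.cov(Xc, rowvar=False)               # feature covariance
    eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]            # sort descending
    W = eigvecs[:, order[:n_components]]         # top-k loading vectors
    retained = eigvals[order[:n_components]].sum() / eigvals.sum()
    return Xc @ W, retained

# Toy example: 100 samples of 5 correlated "sensor" channels that are
# essentially rank-2, so two components capture nearly all the variance.
rng = np.random.default_rng(0)
base = rng.normal(size=(100, 2))
X = base @ rng.normal(size=(2, 5)) + 0.01 * rng.normal(size=(100, 5))
Z, retained = pca_compress(X, 2)
```

The compressed matrix `Z` would then be the training input for the SVM classifier in a pipeline like the one the abstract describes.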

  • Efficient Blind Nonparametric Dependent Signal Extraction Algorithm for Determined and Underdetermined Mixtures   Order a copy of this article
    by Fasong WANG 
    Abstract: Blind extraction or separation of statistically independent source signals from linear mixtures has been well studied over the last two decades by searching for local extrema of certain objective functions, such as non-Gaussianity (NG) measures. In this paper, a blind source extraction (BSE) algorithm for extracting statistically dependent source signals from determined and underdetermined linear mixtures is derived using a nonparametric NG measure. After showing that maximisation of the NG measure can also separate or extract statistically weakly dependent source signals, the nonparametric NG measure is defined by statistical distances between the distributions of the separated signals based on the cumulative distribution function (CDF) rather than the traditional probability density function (PDF); it can be estimated efficiently from quantiles and order statistics (OS) using the norm. The nonparametric NG measure is optimised by a deflation procedure to extract or separate the dependent source signals. Simulation results on synthetic and real-world data show that the proposed nonparametric extraction algorithm can extract the desired dependent source signals and yields good performance.
    Keywords: blind source separation; non-Gaussianity measure; independent component analysis; probability density function; dependent component analysis; underdetermined blind source extraction.

  • Indic Script Identification from Handwritten Document Images   Order a copy of this article
    by Pawan Kumar Singh, Ram Sarkar, Mita Nasipuri 
    Abstract: Script identification plays an important role in document image processing, especially in multilingual environments. This paper employs two conventional textural methods for recognising the scripts of handwritten documents inscribed in different Indic scripts. The first method extracts the well-known Haralick features from the Spatial Gray-Level Dependence Matrix (SGLDM), and the second computes the fractal dimension using Segmentation-Based Fractal Texture Analysis (SFTA). Finally, a 104-element feature vector is constructed from the features produced by these two methods. The proposed technique is evaluated on a dataset comprising 360 handwritten document pages written in 12 official Indian scripts, namely Bangla, Devanagari, Gujarati, Gurumukhi, Kannada, Malayalam, Manipuri, Oriya, Tamil, Telugu, Urdu and Roman. Experiments with multiple classifiers reveal that the Multi Layer Perceptron (MLP) achieves the highest identification accuracy of 96.94%. These encouraging results confirm the efficacy of conventional textural features for handwritten Indic script identification.
    Keywords: Script Identification; Handwritten Indic documents; Textural Features; Spatial Gray-Level Dependence Matrix; Segmentation-Based Fractal Texture Analysis; Statistical Significance Tests.
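The SGLDM of the abstract is the classic grey-level co-occurrence matrix from which Haralick features are computed. A minimal sketch, assuming a tiny quantised image and a single horizontal offset (the paper's actual offsets, quantisation levels and full feature set are not specified here):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Spatial Grey-Level Dependence (co-occurrence) Matrix for one offset."""
    M = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            M[img[y, x], img[y + dy, x + dx]] += 1
    return M / M.sum()                      # normalise to joint probabilities

def haralick_energy_contrast(P):
    """Two classic Haralick features from a normalised co-occurrence matrix."""
    i, j = np.indices(P.shape)
    energy = (P ** 2).sum()                 # angular second moment
    contrast = ((i - j) ** 2 * P).sum()     # weighted grey-level difference
    return energy, contrast

# Toy 4x4 image quantised to 4 grey levels.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
P = glcm(img, levels=4)
energy, contrast = haralick_energy_contrast(P)
```

In practice several offsets and angles are accumulated, and a dozen or more Haralick statistics are concatenated into the feature vector.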

  • Combining RSS-SVM with Genetic Algorithm for Arabic Opinions Analysis   Order a copy of this article
    by Amel ZIANI, Nabiha Azizi, Djamel Zenakhra, Soraya Cheriguene 
    Abstract: The Arabic language has drawn the attention of data mining researchers due to its large user base, but it presents challenges because of its rich and complex morphology. This explains the growing importance of Arabic sentiment analysis, and in particular of Arabic opinion detection and classification. The most accurate classification technique in this area, as demonstrated by several previous works, is the Support Vector Machine (SVM) classifier. SVM can increase opinion-mining accuracy, but only with a very small number of features, and reducing the feature vector can degrade system performance by deleting pertinent features. To overcome these two constraints (feature vector size and consideration of all extracted primitives), our idea is to use the Random Sub Space (RSS) algorithm in a first stage to generate several feature vectors of limited size, and to replace the decision-tree base classifier of RSS with the more accurate SVM, so that each generated feature subset becomes the input of an individual SVM classifier. Although this approach achieved high results, a second proposition was implemented to enhance it: a genetic algorithm is used as the feature subset generator, based on correlation criteria, to eliminate the random choice used by RSS and to prevent the use of incoherent feature subsets. Feature extraction is therefore the main step in the opinion-mining process. To extract and compute the most informative linguistic and statistical features, we had to contend with the lack of labelled Arabic datasets, so a translated SentiWordNet dataset was used to overcome this problem and improve system performance. Experimental results on one thousand (1000) labelled reviews collected from Arabic Algerian newspapers and manually annotated are very promising.
    Keywords: Arabic opinion mining; SentiWordNet; Machine learning; SVM (Support Vector Machine); RSS (Random Sub Space); GA (Genetic Algorithm).

  • Facial beauty analysis by age and gender   Order a copy of this article
    by Manal El Rhazi, Arsalane Zarghili, Aicha Majda, Anissa Bouzalmat, Ayat Allah Oufkir 
    Abstract: The face is the first source of information that conveys the attractiveness of a human being; for this reason, several studies in aesthetic medicine and image processing have analysed the aesthetic quality of the adult human face. This paper proposes an automatic procedure for the analysis of facial beauty. First, we detect the face zone in an image and its feature areas, then we present our novel method to extract feature corners, and finally we analyse the facial aesthetic quality. Experimental results show that our method can extract the feature corners accurately for the majority of faces in the ECVP and FEI image databases, and that there exists a difference in facial beauty analysis by gender and age, due to anatomic differences in specific facial areas between the categories.
    Keywords: Facial features; Plastic surgery; Facial attractiveness; Facial beauty analysis; Image processing; Image analysis.

  • Enhancement based Background Separation Techniques for Fruit Grading and Sorting   Order a copy of this article
    by Jasmeen Gill 
    Abstract: Image processing plays a remarkable role in the automation of fruit grading and sorting. When grading fruit, accurate extraction of the fruit object from the image (background separation) is the chief concern. An appropriate segmentation technique is employed to extract the fruit, and to accomplish this accurately, enhancement must be performed prior to segmentation. However, most researchers have emphasised fruit segmentation alone. This communication is intended to show the potential of enhancement techniques when combined with fruit image segmentation, and presents a comparative analysis of enhancement-based background separation techniques for fruit grading and sorting. For this purpose, four enhancement techniques, namely Contrast Limited Adaptive Histogram Equalisation (CLAHE), the Gaussian filter, the Median filter and the Wiener filter, were utilised, and Basic Global thresholding, Adaptive thresholding, Otsu thresholding and Otsu-HSV thresholding were applied for segmentation. Sixteen sub-models were developed by combining each enhancement method with every segmentation technique. Afterwards, the image quality of the sub-models was validated using quantitative as well as qualitative analyses. Test results demonstrate that the CLAHE/Otsu-HSV model outperformed the others for fruit grading and sorting.
    Keywords: Digital image processing; Segmentation; Enhancement; Fruit grading and sorting; Background separation; Otsu-HSV segmentation.
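Otsu thresholding, the segmentation family that performs best in the abstract, can be sketched in a few lines. This is the standard textbook method on a synthetic bimodal histogram, not the paper's CLAHE/Otsu-HSV pipeline; the toy image is invented for illustration.

```python
import numpy as np

def otsu_threshold(gray):
    """Return the grey level maximising between-class variance (Otsu's method)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()     # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0        # class means
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Bimodal toy "fruit vs background" image: dark background, bright object.
img = np.concatenate([np.full(500, 40), np.full(500, 200)]).astype(np.uint8)
t = otsu_threshold(img)
```

Enhancement such as CLAHE sharpens the two histogram modes before this step, which is why the combination outperforms segmentation alone.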

  • Soft Neural Network based Block Chain Risk Estimation   Order a copy of this article
    by Ganglong Duan, Wenxiu Hu, Yu Tian 
    Abstract: Financial risk refers to the uncertainty caused by changes in economic and financial conditions. As an economic phenomenon, financial risk is objective and cannot be eliminated, and there remain imperfect aspects in current research on financial risk assessment. To achieve a comprehensive evaluation of financial risks, this paper analyses the methodology of soft computing and neural networks. The basic function of a financial risk monitoring and evaluation system is to forecast the trend of financial activities and risk status; this is also the fundamental objective of the assessment system. We use BP neural network theory to establish a logistics finance risk evaluation model, using the BP neural network structure and training principles to train the sample data. The soft computing method accounts for factors of uncertainty and irrationality, breaking through the limitations of traditional hard computing. There is a consistency between the fuzzy reasoning of soft computing and the attributes and structure of the objective world; therefore, soft computing can be used in the field of financial risk assessment.
    Keywords: Block chain; financial risk; assessment; neural network; soft computing.

  • Data Flow Tracking based Block Chain Modelling   Order a copy of this article
    by Ganglong Duan, Wenxiu Hu, Yu Tian 
    Abstract: This article analyses blockchain, a recent focus of the global financial industry. It introduces the concept and scope of the technology and, taking financial payment services as an example, analyses four characteristics of blockchain: decentralisation, trustlessness, collective maintenance and a secure database. Surveying the application and research status of blockchain technology in the financial field, and taking international payments as an example, it analyses the differences between blockchain-based and traditional payment modes, and argues that the international financial industry's attention to blockchain is, in essence, an attempt to build a flat, globally integrated settlement system. Finally, we summarise the development trends of blockchain technology innovation and put forward some key issues that deserve attention. Blockchain is a decentralised infrastructure and distributed computing paradigm that has gradually risen with the popularity of Bitcoin and other digital cryptocurrencies; at present it has attracted great attention from government departments, financial institutions, enterprises and capital markets. Blockchain technology is decentralised, maintains time-series data collectively, and is programmable, safe and reliable, making it especially suitable for building programmable monetary, financial and macro-social systems. The rapid development of financial blockchain applications such as digital currency and smart contracts creates new financial risks, which brings a series of challenges to the existing financial supervision system in China.
    Keywords: data stream; tracking algorithm; finance; block chain; technical framework.

    by K.J. Kavitha 
    Abstract: Securing the medical images, to make it tamper free is a terribly difficult task. This challenge is with efficiency handled with the assistance, digital watermarking techniques. With the assistance of this growing technology we are able to evaluate validation, dependability, privacy and integrity of the medical images. Several algorithms are enforced on this technology. The Digital Watermarking (DWM) is implemented in two main domains: transform & spatial. The DWM is mostly implemented using the transform techniques such as Singular Valued Decomposition (SVD), Discrete Cosine Transform (DCT), Discrete Wavelet Transform (DWT), combination of DCT and DWT and also with the combination of DWT and SVD. These days the work is extended with Integer Wavelet Transform (IWT). One of the foremost challenges in these technologies is information embedding capability or we say; number of bits hidden in the cover information. This parameter is considered for evaluation of the system since as we know that the more number of bits; the distortion of the medical information is also more and vice versa. The distortion of the information is strictly avoided within the case of medical and military applications. To beat this downside of the technology, these days research work goes on to implement digital watermarking technique with less information embedding capability. One of the possible ways to reduce the number of bits in information is to use Quick /fast response code (QR Code). The QR code consumes a less space compared to the other existing formats available such as barcode. In this paper Associate in nursing approach is planned to implement the digital watermarking technique for the medical images that includes the following techniques: Integer wavelets Transform (IWT), Bit plane methodology and QR code. For the proposed system, the watermarked image is evaluated against a number of the parameters to grasp the potency of technique being utilized. 
The experiment is dispensed for 2 totally different bit planes and also the results are compared to point out embedding within which of the bit plane range ends up in additional potency and eventually the conclusion is formed.
    Keywords: Watermarking; DWT; QR Code.

  • Automatic Identification of Rhetorical Relations Among Intra-sentence Discourse Segments in Arabic   Order a copy of this article
    by Samira Lagrini, Nabiha Azizi, Mohammed Redjimi, Montheer Al Dwairi 
    Abstract: Identifying discourse relations, whether implicit or explicit, has seen renewed interest and remains an open challenge. We present the first model that automatically identifies both explicit and implicit rhetorical relations among intra-sentence discourse segments in Arabic text. We build a large discourse-annotated corpus following the Rhetorical Structure Theory framework. Our list of rhetorical relations is organised into a three-level hierarchy of 23 fine-grained relations grouped into seven classes. To learn these relations automatically, we evaluate and reuse features from the literature, and contribute three additional features: accusative of purpose, specific connectives, and the number of antonym words. We perform experiments on identifying both fine-grained and coarse-grained relations. The results show that, compared with all the baselines, our model achieves the best performance in most cases, with an accuracy of 91.05%.
    Keywords: Discourse relations; Rhetorical structure theory; Arabic language.

  • Unsupervised Generation of Arabic Words   Order a copy of this article
    by Ahmed Khorsi, Abeer Alsheddi 
    Abstract: Automated word generation might be seen as the reverse of morphology learning: the aim is to automatically coin valid words in the target language. As with many other challenges in Natural Language Processing (NLP), the generation engine may be built using a supervised or an unsupervised approach. The former requires a clean learning dataset of a decent size, whereas the latter needs no more than plain text; nonetheless, unsupervised approaches are usually blamed for their low accuracy. The present article reports the results of an investigation into context-free generation of classical Arabic words. Unsupervised and relatively simple, the proposed approach easily reached an accuracy of 90%.
    Keywords: Arabic language; classical vocabulary; computational linguistics; corpus expansion; linguistic corpora; morphology learning; natural language processing; unsupervised learning; statistical linguistics; word generation.

    by Subhashree Choudhury, Pravat Kumar Rout 
    Abstract: A Photovoltaic (PV) based distributed generation system has a nonlinear power characteristic curve under random variation in solar irradiance, ambient temperature and electric load. As a result, accurate detection and tracking of the maximum power points (MPPs) requires an optimal controller with dynamic control capability. To address this issue, this paper presents an intelligent Mamdani-based Fuzzy Logic Controller (MFLC) for maximum power point tracking (MPPT) of a PV system. Different test cases covering possible load and irradiance variations in grid-connected operation are investigated. To confirm that the power quality indices remain within IEEE standard specifications, Fast Fourier Transform (FFT) analysis of the voltage and current at the point of common coupling has been performed. A detailed comparison is made between PV without MPPT, with Incremental Conductance, and with the proposed Fuzzy Logic Control (FLC). The results show enhanced efficiency of energy production from PV and reflect the effectiveness of the proposed scheme, justifying its real-time application.
    Keywords: Photovoltaic (PV) array; Maximum power point Tracking (MPPT); dc-dc boost converter; Incremental Conductance (IC); Mamdani based fuzzy logic controller (MFLC); Fast Fourier Transform (FFT).
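The Incremental Conductance baseline the abstract compares against has a simple decision rule: at the MPP, dI/dV = -I/V. A minimal sketch of one tracking update (the step size and the toy operating points are invented; the paper's fuzzy controller replaces this crisp rule with fuzzy inference):

```python
def inc_cond_step(v, i, v_prev, i_prev, step=0.5):
    """One incremental-conductance MPPT update.

    Returns the voltage-reference change to apply: at the maximum power
    point dI/dV == -I/V, left of it dI/dV > -I/V, right of it dI/dV < -I/V.
    """
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di == 0:
            return 0.0                # operating point unchanged: hold
        return step if di > 0 else -step
    if di / dv == -i / v:
        return 0.0                    # dP/dV == 0: at the MPP
    if di / dv > -i / v:
        return step                   # left of the MPP: raise voltage
    return -step                      # right of the MPP: lower voltage
```

For example, on a linear toy curve I = 5 - 0.1 V (MPP at V = 25), an operating point at V = 10 yields a positive step and one at V = 40 a negative step.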

    by Jahnavi Reddy 
    Abstract: Internet-based news documents are an important source of information transmission. Large numbers of news documents from various newswire sources are available on the internet, and it is almost impossible to view all the documents returned by a user's search. Term weighting is a useful technique that extracts important features from textual documents, thereby providing a basis for different text mining approaches. While several term weighting algorithms based on manifold statistical measures have been proposed in the past, they are inaccurate in extracting salient terms from internet-based digitised news documents. The objective of this work is to study the existing term weighting algorithms for feature extraction and to develop an efficient term weighting algorithm for mining salient features from internet-based newswire sources. TF*PDF (Term Frequency * Proportional Document Frequency) is the most popular term weighting algorithm for extracting influential features from news archives. TF*PDF satisfies the basic property of features in news documents, i.e. frequency, and thus increases accuracy compared with other term weighting algorithms such as Binary, TF (Term Frequency), TF-IDF (Term Frequency-Inverse Document Frequency) and its variants. However, frequency alone is not sufficient for salient topic extraction. To overcome this problem, this paper presents an innovative and effective term weighting algorithm that considers Position, Scattering and Topicality along with Frequency for extracting salient events. Frequency considers the number of occurrences of a term; Position focuses on the location of the term; Scattering focuses on the distribution of a term over the entire document; and Topicality is the variation in the frequency of usage of a term over a period of time. Experimental evaluation shows that the proposed term weighting algorithm performs better than the existing term weighting algorithms in terms of Coverage Rate.
    Keywords: Term Weighting; TF*PDF; FPST.
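One common formulation of the TF*PDF baseline discussed above weights a term by its channel-normalised frequency times an exponential of the fraction of the channel's documents containing it. A sketch under that assumption (the tiny corpus is invented; the paper's proposed Frequency/Position/Scattering/Topicality algorithm is not reproduced here):

```python
import math
from collections import Counter

def tf_pdf(channels):
    """TF*PDF weights. `channels` is a list of channels, each a list of
    documents, each a list of tokens.

    weight_j = sum over channels c of |F_jc| * exp(n_jc / N_c), where F_jc
    is term j's frequency in channel c normalised by the Euclidean norm of
    all term frequencies, and n_jc / N_c is the fraction of the channel's
    documents that contain j.
    """
    weights = Counter()
    for docs in channels:
        N = len(docs)
        tf = Counter()                       # raw term frequency in channel
        df = Counter()                       # document frequency in channel
        for doc in docs:
            tf.update(doc)
            df.update(set(doc))
        norm = math.sqrt(sum(f * f for f in tf.values()))
        for term, f in tf.items():
            weights[term] += (f / norm) * math.exp(df[term] / N)
    return weights

# Two toy "channels" (newswire sources) of two documents each.
docs = [[["flood", "rain", "city"], ["flood", "rescue"]],
        [["flood", "warning"], ["sports", "score"]]]
w = tf_pdf(docs)
```

A term such as "flood" that recurs across channels and documents outweighs terms that appear only once, which is the frequency property the abstract refers to.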

  • Neural Network Based Adaptive Selection CFAR for Radar Target Detection in Various Environments   Order a copy of this article
    by Budiman Putra Asmaur Rohman, Dayat Kurniawan 
    Abstract: Constant False Alarm Rate (CFAR) detection, a target detection method commonly used in radar systems, performs inconsistently across different environments. To improve radar detectability, this paper proposes a novel radar target detection scheme using neural network based adaptive selection CFAR. The proposed method employs the Cell-Averaging, Ordered-Statistic, Greatest-of and Smallest-of CFAR thresholds as reference bases. The pattern of those threshold values, combined with the Cell Under Test signal value, is identified and classified by the neural network to compute a raw threshold; the final threshold is then selected as the referenced CFAR value nearest to the raw one. The performance of the proposed method is examined in three representative radar scenarios: homogeneous background, multiple targets and clutter boundary. The results show that the proposed method outperforms the classical CFARs because the adaptive selection algorithm properly selects among the referenced CFARs for the given cases, particularly in the homogeneous and multiple-target environments.
    Keywords: neural network; adaptive selection; CFAR; radar; target detection.
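The Cell-Averaging CFAR used as one of the reference thresholds can be sketched as follows. This is the textbook CA-CFAR on synthetic exponential noise, not the paper's neural-network selector; window sizes and the scale factor are illustrative choices.

```python
import numpy as np

def ca_cfar(x, n_train=8, n_guard=2, scale=3.0):
    """Cell-Averaging CFAR over a 1-D range profile.

    For each cell under test (CUT), average n_train training cells split
    across both sides (skipping n_guard guard cells next to the CUT) and
    declare a detection when the CUT exceeds scale * noise estimate.
    """
    N = len(x)
    detections = np.zeros(N, dtype=bool)
    half = n_train // 2
    for cut in range(half + n_guard, N - half - n_guard):
        left = x[cut - n_guard - half: cut - n_guard]
        right = x[cut + n_guard + 1: cut + n_guard + 1 + half]
        noise = np.concatenate([left, right]).mean()
        detections[cut] = x[cut] > scale * noise
    return detections

rng = np.random.default_rng(1)
signal = rng.exponential(1.0, 200)   # homogeneous noise background
signal[100] += 30.0                  # a strong point target
hits = ca_cfar(signal)
```

The Ordered-Statistic, Greatest-of and Smallest-of variants differ only in how `noise` is computed from the training cells, which is exactly the variation the proposed network arbitrates between.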

  • Local Search-Based Recommender System for Computing the Similarity Matrix   Order a copy of this article
    by Yousef Kilani, Ayoub Alsarhan, Mohammad Bsoul, Subhieh El-Salhi 
    Abstract: Recommender systems (RSs) reduce the users' effort in finding their favourite items among a great number of items. In collaborative-based RSs, there are different similarity measures to compute the similarity values between every two users or two items, including genetic algorithms and Pearson and cosine-based similarity techniques. The number of items and personal attributes (e.g. environment, sex, job, religion, age, country, education) used by the similarity metric algorithms is increasing significantly, which makes the recommendation task more difficult. In this work, we introduce a new RS that uses local search algorithms to compute the similarity matrix; to the best of our knowledge, no prior work in the RS literature uses local search techniques. We name the new recommender system LSRS. We use part of the dataset as training data (e.g. 80%) to calculate the similarity between every two users, and the remaining part (e.g. 20%) as testing data. LSRS initialises the similarity value between every two users to a random value between 0 and 1, and then uses local search to adjust this value by training the recommender system on the training data. We show experimentally that LSRS computes the similarity matrix and outperforms other techniques such as Pearson correlation and cosine similarity, as well as some recent genetic-based recommender systems.
    Keywords: Collaborative filtering-based recommender systems; similarity matrix; Recommender Systems; local search algorithms; similarity measures.
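The cosine-similarity baseline the abstract compares against fills the same user-user similarity matrix that LSRS learns by local search. A minimal sketch with an invented toy ratings matrix:

```python
import numpy as np

def cosine_similarity_matrix(R):
    """User-user cosine similarity from a ratings matrix R (users x items)."""
    norms = np.linalg.norm(R, axis=1, keepdims=True)
    norms[norms == 0] = 1.0                 # avoid division by zero
    U = R / norms                           # unit-length user vectors
    return U @ U.T                          # pairwise dot products

# Rows are users, columns items; 0 means "not rated".
R = np.array([[5.0, 3.0, 0.0, 1.0],
              [4.0, 0.0, 0.0, 1.0],
              [1.0, 1.0, 0.0, 5.0]])
S = cosine_similarity_matrix(R)
```

Here users 0 and 1, who rate the same items similarly, come out more similar than users 0 and 2; LSRS instead treats each entry of `S` as a value to be adjusted by local search against the training ratings.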

  • Similarity searching in ligand based virtual screening using different fingerprints and different similarity coefficients   Order a copy of this article
    Abstract: Similarity searching plays an increasingly important role in virtual screening. It is a screening technique that compares the features of a target compound with the features of each compound in a database of compounds. This comparison can be described in three steps. The first step represents the target compound and the database compounds in an equivalent form: a set of binary elements describing the presence or absence of compound attributes (a fingerprint). The second step uses a similarity coefficient to calculate a similarity score between two compound representations. The third step ranks the database compounds by similarity score in order to determine the active compounds. Many approaches and techniques have been introduced in the literature to enhance and improve similarity-based virtual screening. In this work, our primary interest is to investigate the effect of using different combinations of fingerprint and similarity coefficient in ligand-based virtual screening (LBVS). We use the MDDR (Drug Data Report) database to evaluate the different descriptor-coefficient combinations. Some combinations with certain coefficients demonstrate performance superior to that obtained with the Tanimoto coefficient.
    Keywords: Ligand Based; Virtual Screening; Similarity Searching; Similarity Coefficients; Molecular Descriptors; Fingerprint; Drug Discovery.
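The Tanimoto coefficient that serves as the reference point above compares two binary fingerprints as the ratio of shared on-bits to total on-bits. A minimal sketch, with hypothetical bit positions standing in for a real structural-key fingerprint:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) coefficient on binary fingerprints given as
    sets of on-bit positions: |A & B| / |A | B|."""
    a, b = set(fp_a), set(fp_b)
    union = a | b
    if not union:
        return 0.0                 # two empty fingerprints: define as 0
    return len(a & b) / len(union)

# Hypothetical on-bit positions of a query and a database compound.
query = {1, 4, 9, 16, 25}
candidate = {1, 4, 9, 30}
score = tanimoto(query, candidate)
```

Alternative coefficients (Dice, Cosine, Russell-Rao and others) change only this scoring function while the fingerprint representation and ranking steps stay the same, which is what makes the descriptor-coefficient grid of the paper easy to explore.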

  • Collaborative Approach to Secure Agents in Ubiquitous Healthcare Systems   Order a copy of this article
    by Nardjes BOUCHEMAL 
    Abstract: In sensitive domains such as healthcare, where people's lives are at stake, fast access to health information is very important, especially in emergencies (e.g. allergies and chronic diseases). The agent paradigm is therefore very promising for ubiquitous healthcare systems. However, the inherent complexity of information security is greater because agents are autonomous, intelligent, and not under the control of a single entity. Indeed, the big challenge in agent-based ubiquitous healthcare systems is to ensure that emergency workers and doctors can access personal information quickly and whenever needed, but with a high level of security. Heavy cryptographic machinery saturates agents embedded in resource-limited devices and obstructs healthcare workers. The idea is to lighten agents with simple cryptographic concepts while strengthening surveillance and making it collaborative: all agents of the system are concerned with security and collaborate to maintain it. This paper addresses security challenges in agent-based ubiquitous healthcare systems and presents a collaborative approach. The proposed agents are implemented in the JADE-Leap platform designed for restricted devices.
    Keywords: Security; U-healthcare; Collaboration; Collective Decision; Ubiquitous Agents;.

  • Statistical Assessment of Nonlinear Manifold Detection Based Software Defect Prediction Techniques   Order a copy of this article
    by Soumi Ghosh, Ajay Rana, Vineet Kansal 
    Abstract: Prediction of software defects is immensely important for obtaining improved and desired outcomes at minimised cost and in less time. Defect prediction in software systems has attracted researchers to apply various techniques, but these have not been found fully effective. Software datasets comprise redundant or undesired features that hinder the effective application of techniques, resulting in more time-consuming and less accurate prediction of defective areas of software. Hence, proper techniques are required for accurate software defect prediction. A newer application of Nonlinear Manifold Detection Techniques (Nonlinear MDTs) has been examined for accurate prediction of defects in less time and at lower cost using different classification techniques. In this work, we analysed and tested the effect of Nonlinear MDTs to find the classification technique with the highest accuracy across all software datasets. A comparison has been made between the results obtained without and with Nonlinear MDTs to estimate the improvement in classifier performance achieved by reducing dimensions. A paired two-tailed t-test has been performed to statistically test and verify the performance of classifiers using Nonlinear MDTs on all datasets. The outcome reveals that, among all Nonlinear MDTs, FastMVU yields the most accurate prediction of software defects for most of the classification techniques.
    Keywords: Dimensionality Reduction; FastMVU; Machine Learning; Manifold Detection; Nonlinear; Promise Repository; Software Defect Prediction.

  • A game-based virtual machine pricing mechanism in federated clouds   Order a copy of this article
    by Ying Hu 
    Abstract: In a federated cloud environment, diverse pricing schemes among different IaaS service providers (ISPs) form a complex economic landscape that nurtures the market of cloud brokers. Although pricing mechanisms have been proposed in the past few years, few of them address competitive and cooperative behaviours among different ISPs. In this paper, we employ the learning curve to model the operation cost of ISPs and introduce a novel algorithm that determines the cooperative pricing mechanism among different ISPs. The cooperation decision algorithm uses the operation cost computed from the learning curve model, together with price policies obtained from the competition part, as parameters to calculate the final revenue when outsourcing or locally satisfying users' resource requests. Extensive experiments are conducted on a real-world federated cloud platform, and the experimental results are compared with three existing pricing mechanisms. Our results show that the proposed pricing mechanism effectively improves resource utilisation and reduces the profit loss caused by request rejection.
    Keywords: cloud computing; pricing mechanism; resource market; game theory.
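The learning-curve cost model the abstract relies on is commonly taken to be Wright's curve: every doubling of cumulative output multiplies the unit cost by a fixed learning rate. A sketch under that assumption (the abstract does not specify the exact curve or its parameters, so the 80% rate and costs here are illustrative):

```python
import math

def unit_cost(first_cost, n, learning_rate=0.8):
    """Wright's learning curve: unit cost of the n-th unit of cumulative
    output, where each doubling of n multiplies the cost by learning_rate."""
    b = math.log2(learning_rate)           # negative exponent
    return first_cost * n ** b

# Operation cost of serving the 1st, 2nd and 4th VM-hour of a given type.
c1 = unit_cost(100.0, 1)
c2 = unit_cost(100.0, 2)
c4 = unit_cost(100.0, 4)
```

Under such a model, an ISP with a larger cumulative workload serves a request more cheaply, which is what makes outsourcing between ISPs potentially profitable for both sides.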

  • Real Time Path Planning for High Speed UGVs   Order a copy of this article
    by Ajith Gopal, Elsmari Wium 
    Abstract: The application of a modified A-Star (A*) global search algorithm and trajectory planner based on the tentacles algorithm approach are investigated for real time path and trajectory planning on an unmanned ground vehicle operating at a speed of 40 km/h. The fundamental assumption made is that for high speed applications, the requirement for an optimal path is secondary to the requirement for short processing times, provided that a solution, if it exists, is found. The proposed solution is benchmarked against the original A* algorithm and shows a reduction in search space of up to 84% and a reduction in processing time of up to 97%. Results for the trajectory planner are also presented, though no direct comparative evaluation against the original tentacles algorithm was executed. The combined path and trajectory processing time of the proposed solution translates to less than 2 mm of travel distance before a reaction to a change in the environment can be processed.
    Keywords: Path Planning; Trajectory Planning; UGV; Real Time; A-Star.
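The paper's modified A* is not specified in the abstract; as a point of reference, a minimal textbook A* on an occupancy grid (4-connected moves, Manhattan heuristic, all names illustrative) can be sketched as:

```python
import heapq

def astar(grid, start, goal):
    """Minimal A* on a 4-connected grid; 1 = obstacle, 0 = free.
    Returns the path as a list of (row, col) cells, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while open_set:
        f, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cell[0] + dr, cell[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

The paper's contribution lies in pruning this search space for high-speed operation, which the baseline above does not attempt.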

  • Detection of Glaucoma based on Cup-to-Disc Ratio using Fundus Images   Order a copy of this article
    by Imran Qureshi, Muhammad Attique, Muhammad Sharif, Tanzila Saba 
    Abstract: Glaucoma is permanent damage to the optic nerve that causes partial or complete loss of vision. This work presents a glaucoma detection scheme that measures the cup-to-disc ratio (CDR) from fundus photographs. The proposed system consists of image acquisition, feature extraction and glaucoma assessment steps. Image acquisition covers the transformation of an RGB fundus image into grey-scale form and enhancement of the contrast of fundus features. The boundaries of the optic disc and cup are then segmented in the feature extraction step. Finally, the cup-to-disc ratio of the processed image is computed to assess glaucoma in the image. The proposed system is tested on 398 fundus images from four publicly available datasets, obtaining an average sensitivity of 90.6%, specificity of 97% and accuracy of 96.1% in glaucoma diagnosis. The achieved results show the suitability of the proposed method for glaucoma detection.
    Keywords: Cup-to-disc ratio (CDR); fundus images; glaucoma; image processing; optic disc; segmentation.
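Once the cup and disc are segmented, the CDR itself is a simple quotient; a minimal sketch, with an illustrative screening threshold that is not taken from the paper:

```python
def cup_to_disc_ratio(cup_area, disc_area):
    """Cup-to-disc ratio from segmented pixel areas."""
    if disc_area <= 0:
        raise ValueError("disc area must be positive")
    return cup_area / disc_area

def is_glaucoma_suspect(cdr, threshold=0.5):
    # A commonly cited screening cutoff; the paper's exact decision rule is not stated.
    return cdr > threshold

cdr = cup_to_disc_ratio(cup_area=1800, disc_area=5000)
```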

    by Werneld Ngongi, Jialu Du, Rui Wang 
    Abstract: This paper presents a generalized predictive control algorithm (GPCA) for a ship dynamic positioning (DP) controller, using a Controlled Autoregressive Integrated Moving Average (CARIMA) model to describe the controlled object. The proposed control system is capable of making the position and heading of the ship converge to the desired values through the choice of the error correction coefficient, parameter adaptation and feedback correction techniques. Firstly, the basic principle of the generalized predictive control algorithm is introduced. Secondly, the algorithm is used to design the ship dynamic positioning controller. Finally, a simulation of the designed controller is given. Simulation results prove the effectiveness and robustness of the controller.
    Keywords: Dynamic positioning; Generalized Predictive Controller; feedback correction; Rolling Optimization; performance index; surface ships.

  • Meta-Heuristic Techniques for Path Planning: Recent Trends and Advancements   Order a copy of this article
    by Monica Sood, Vinod Kumar Panchal 
    Abstract: Path planning is a propitious research domain with extensive application areas. It is the procedure of constructing a collision-free path from a specified source to a destination point. Earlier, classical techniques were widely implemented to solve path planning problems. Classical techniques are very easy to implement, but they are time-consuming and are not effective in the presence of uncertainties. Meta-heuristic techniques, by contrast, have the ability to perform even in approximate and uncertain environments, which makes them the more focused choice for optimal path planning research. This paper presents an overview of recent trends and advancements from 2001 to 2017 in the field of optimal path planning using meta-heuristic techniques. During the study, different meta-heuristic algorithms are analyzed and classified into three categories: swarm-based meta-heuristic techniques, non-swarm-based techniques and combinational meta-heuristic techniques. In addition, the basic understanding and applicability of specific algorithms for path planning are also discussed, along with their strengths and downsides.
    Keywords: Path Planning; Meta-Heuristic Techniques; Optimization; Swarm Intelligence; Artificial Intelligence; Machine Learning; Computational Intelligence.

  • A Novel and Improved Developer Rank Algorithm for Bug Assignment   Order a copy of this article
    by Asmita Yadav, Sandeep Kumar Singh 
    Abstract: Analytical studies on automatic bug triaging have the main objective of recommending an appropriate developer for a bug report with reduced bug tossing length, time and effort in bug resolution. In the bug triaging process, if the first recommended developer cannot fix a bug, it is tossed to another developer, and the tossing continues until the bug gets assigned and resolved. Existing approaches, to the best of our knowledge, have not considered developers' contributions and performance assessment metrics in the bug triaging process. In this paper, we propose a novel and improved two-phase Bug Triager comprising developer profile creation and assignment phases. A developer profile is built using individual contribution (IC) and performance assessment (PA) metrics. The contribution and performance of a developer on previously fixed bug reports are analyzed to calculate the developer's weighted score. This score indicates the level of expertise to fix and resolve a newly reported bug. The approach is tested on two open source projects, Eclipse and Mozilla. Empirical results show that the proposed approach achieves a significantly higher F-score of up to 90% for both projects and effectively reduces bug tossing length by up to 11.8% as compared to existing approaches.
    Keywords: Bug Repository; Bug Triaging; Developer's Expertise; Bug Assignment; Bug Reports; Bug Tossing; Developer Contribution Assessment.
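The weighted developer score described above can be sketched as a weighted combination of the IC and PA metrics; the 0.5/0.5 weights and the profile layout below are illustrative assumptions, not the paper's values:

```python
def developer_score(ic, pa, w_ic=0.5, w_pa=0.5):
    """Weighted score from individual-contribution (IC) and
    performance-assessment (PA) metrics, each normalized to [0, 1]."""
    return w_ic * ic + w_pa * pa

def rank_developers(profiles):
    """profiles: {name: (ic, pa)} -> names sorted by score, best first."""
    return sorted(profiles, key=lambda d: developer_score(*profiles[d]), reverse=True)

profiles = {"alice": (0.9, 0.6), "bob": (0.4, 0.8), "carol": (0.7, 0.7)}
ranking = rank_developers(profiles)
```

A new bug report would then be assigned to the highest-ranked developer, falling back down the ranking if it gets tossed.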

  • Biased Face Patching Approach for Age Invariant Face Recognition using Convolutional Neural Network   Order a copy of this article
    by Mrudula Nimbarte, Kishor K. Bhoyar 
    Abstract: In recent years, considerable interest has been observed among researchers in the domain of age invariant face recognition. The growing research interest is due to its commercial applications in many real-world scenarios. Many researchers have proposed innovative approaches to solve this problem, but a significant gap remains. In this paper, we propose a novel technique to fill in this gap: instead of using the whole face of a person, we use horizontal and vertical face patches. Two different feature vectors are obtained from these patches using Convolutional Neural Networks (CNN). These two feature vectors are then fused using a weighted average of the features of both patches. Lastly, an SVM is used as a classifier on the fused vector. Two publicly available datasets, FGNET and MORPH (Album 2), are used for testing the performance of the system. This novel approach outperforms other contemporary approaches, with a very good Rank-1 recognition rate on both datasets.
    Keywords: Face Recognition; AIFR; Aging Model; Deep Learning; CNN; Weighted Average.
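The weighted-average fusion step can be sketched directly; the `alpha` weight and vector layout are illustrative, not taken from the paper:

```python
def fuse_features(horiz, vert, alpha=0.5):
    """Weighted average of two equal-length feature vectors
    (horizontal-patch and vertical-patch CNN features)."""
    if len(horiz) != len(vert):
        raise ValueError("feature vectors must have equal length")
    return [alpha * h + (1 - alpha) * v for h, v in zip(horiz, vert)]

fused = fuse_features([1.0, 2.0, 3.0], [3.0, 2.0, 1.0], alpha=0.5)
```

The fused vector is what the SVM classifier would consume.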

  • Automatic Sizing of CMOS based Analog Circuits Using Cuckoo Search Algorithm   Order a copy of this article
    by Pankaj Prajapati 
    Abstract: The increasing complexity of physical models of MOSFETs and process variations with the downscaling of CMOS technology have made the manual design of analog circuits challenging and time-consuming. Therefore, the development of efficient automatic analog circuit design techniques looks very attractive. In this work, the Cuckoo Search (CS) algorithm has been tested for the optimum design of CMOS based analog circuits with high optimization fitness. The CS algorithm has been implemented in the C language and interfaced with the Ngspice circuit simulator. The CS algorithm is used as a searching tool for transistor sizing, and Ngspice is used as a fitness evaluator. Various analog circuits, such as a CMOS common-source amplifier, a CMOS cascode amplifier and a CMOS differential amplifier with a current mirror load, have been optimized using this automatic optimization tool with BSIM3v3 MOSFET models in a 180 nm CMOS technology. This technique gives more accurate results and consumes less time as compared to manual circuit design.
    Keywords: Cuckoo Search algorithm; Optimization; Fitness; Simulator; Transistor Sizing.

  • Product Service Model Constructing Method for intelligent home based on Positive Creative Design Thinking   Order a copy of this article
    by Weiwei Wang, Ting Wei 
    Abstract: With the booming market economy, companies need to maintain competitive advantage through positive and innovative design thinking, and building a service model is an indispensable part of this approach. The design aims to improve the competitiveness of the enterprise by extracting effective user value, establishing a product service system, satisfying the user's needs, and analyzing methods of extracting product form. In this paper, the researcher first selects the target users, draws the user journey map, analyzes the users' psychological activities, and uses innovative design thinking to extract user value. Secondly, according to the positive value elements, a human-object three-dimensional ecological circle is created; at the same time, AHP (Analytic Hierarchy Process) software is used to analyze product modeling and build the product service model. Finally, the reliability of the model construction method is verified through the intelligent air-housekeeper product service system, and the needs of users are met. The method can also provide a reference for other product service designs, reflecting the market competitive advantage of the product.
    Keywords: Product Service Model; User Value; Positive Creative Design Thinking; Intelligent Air-housekeeper Product Service.

  • Local and Global Features Fusion to estimate Expression Invariant Human Age   Order a copy of this article
    by Subhash Chand Agrawal, Anand Singh Jalal, Rajesh Kumar Tripathi 
    Abstract: Human beings can easily estimate the age or age group of a person from a facial image, whereas this capability is not prominent in machines. This problem becomes more complex in the presence of facial expressions and due to age progression. In this paper, we introduce a novel method for age prediction using a combination of local and global features. After detecting the face in the image, we partition the facial image into 16×16 non-overlapping blocks and apply the Grey-Level Co-occurrence Matrix (GLCM) to these blocks. After locating four facial parts (eyes, forehead, left cheek and right cheek) in the facial image, features from the second local descriptor, the Gabor filter, are obtained. The global feature, Histogram of Oriented Gradients (HOG), is used to extract features from the complete face image. Experimental results show that the fusion of local and global features performs better than existing approaches, with a reported mean absolute error (MAE) of 6.31 years on the PAL dataset.
    Keywords: GLCM; Local feature; Global feature; Facial Expression.
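The reported MAE and the local/global feature fusion can be sketched as follows; the concatenation-style fusion shown is an assumption, since the abstract does not state the exact combination rule:

```python
def mean_absolute_error(y_true, y_pred):
    """MAE in years, the metric reported for the PAL dataset."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def fuse_local_global(glcm_feats, gabor_feats, hog_feats):
    """Concatenate local (GLCM, Gabor) and global (HOG) feature vectors."""
    return list(glcm_feats) + list(gabor_feats) + list(hog_feats)

mae = mean_absolute_error([25, 40, 33], [28, 36, 33])
```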

  • Improving English-Arabic statistical machine translation with morpho-syntactic and semantic word class   Order a copy of this article
    by Ines Turki 
    Abstract: In this paper, we present a new method for extracting and integrating morpho-syntactic and semantic word classes in a Statistical Machine Translation (SMT) context to improve the quality of English-Arabic translation. It can be applied across different statistical machine translation systems and to languages that have complicated morphological paradigms. In our method, we first identify morpho-syntactic word classes to build our statistical language model. Then, we apply a semantic word clustering algorithm for English. The obtained semantic word classes are projected from the English side to the featured Arabic side. This projection is based on the word alignment provided by the alignment step using the GIZA++ tool. Finally, we apply a new process to incorporate semantic classes in order to improve SMT quality. We show its efficacy on small and larger English-to-Arabic translation tasks. The experimental results show that introducing morpho-syntactic and semantic word classes achieves a 7.7% relative improvement in the BLEU score.
    Keywords: Morpho-syntactic word classes; semantic word classes; alignment; Statistical machine translation.

  • A QoS-aware virtual resource pricing service based on game theory in federated clouds   Order a copy of this article
    by Tienan Zhang 
    Abstract: Recently, the federated cloud platform has become a promising paradigm for providing cloud services to various kinds of users in a distributed manner. To compete for cloud users, it is critically important for each cloud provider to select an optimal price that best corresponds to its service quality while remaining attractive to cloud users. In this paper, we first formulate the pricing strategy of an individual cloud provider as a constrained optimization problem to analyze the behaviours of both cloud users and cloud providers. Then, we present a game-based model that introduces a set of virtual resource agents to help providers adjust their prices, aiming at a globally optimal solution. Theoretical analysis is presented to prove the validity and effectiveness of the proposed game model, and extensive experiments are conducted on a real-world cloud platform to evaluate its performance. The experimental results show that the proposed pricing model can significantly improve resource revenue for cloud providers and provide desirable quality-of-service (QoS) for user tasks in terms of various performance metrics.
    Keywords: cloud computing; pricing strategy; virtual machine; quality-of-service.
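As a toy illustration of price adjustment toward a game equilibrium (not the paper's model), a best-response iteration under an assumed linear demand function can be sketched as:

```python
def best_response(cost, rival_price, sensitivity=0.5):
    """Best-response price for a provider facing an assumed linear demand
    d(p) = 1 - p + sensitivity * rival_price; profit = (p - cost) * d(p).
    Maximizing the profit gives p = (1 + cost + sensitivity * rival_price) / 2."""
    return (1 + cost + sensitivity * rival_price) / 2

def iterate_to_equilibrium(c1, c2, rounds=50):
    """Two providers repeatedly best-respond until prices settle."""
    p1 = p2 = 1.0
    for _ in range(rounds):
        p1, p2 = best_response(c1, p2), best_response(c2, p1)
    return p1, p2
```

With equal costs of 0.2 and sensitivity 0.5, both prices converge to the fixed point 0.8 of the two best-response equations.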

  • On Collaborative Filtering Model Optimized with Multi-Item Attribute Information Space for Enhanced Recommendation Accuracy   Order a copy of this article
    by Folasade Isinkaye, Yetunde Folajimi, Adesesan Adeyemo 
    Abstract: A recommender system is a type of information filtering system designed to curtail the difficulties of information overload by automatically suggesting relevant items to users, tailored to their preferences. Bayesian Personalized Smart Linear Methods (BPRSLIM) is a variant of the item-based collaborative filtering technique used in information filtering systems. Although this algorithm has shown outstanding performance in a range of applications, it suffers from a serious limitation: it cannot provide accurate and reliable recommendations when the user-item matrix contains insufficient rating information, which reduces its accuracy. In this paper, we propose a framework that integrates multi-item attribute information, besides the classic information on users and items, into the BPRSLIM model in order to ease the sparsity problem associated with it and hence improve its accuracy. The enhanced model is expected to outperform the original BPRSLIM model.
    Keywords: BPRSLIM; Sparsity Problem; Recommender System; Collaborative Filtering; Item Attribute Information; Optimization.

Special Issue on: IRICT 2017 Reliable and Intelligent Information and Communication Technology

  • Interest emotion recognition approach using self-organizing map and motion estimation   Order a copy of this article
    by Kenza Belhouchette, Mohamed Berkane, Hacene Belhadef 
    Abstract: Recognizing human facial expressions and emotions by computer is an interesting and challenging problem. Its usefulness may appear in various fields such as e-learning. Although several approaches have been proposed to recognize emotions based on facial expressions, the recognition rate, amount of used resources and calculation time remain factors for improvement. Our work presents a new approach for recognizing basic emotions (joy, sadness, anger, disgust, surprise and fear) in image sequences. We introduced interest emotion and created its corresponding action units (AUs) based on psychological foundations. Our approach is mainly characterized by minimizing used data and consequently optimizing the computing time and improving the recognition rate. The proposed approach was divided into three steps. The first step is face detection using the method developed by Viola and Jones. The second step concerns the extraction of facial features. At this level, we exploited the Facial Action Coding System proposed by Paul Ekman, which is based on AUs. To detect AUs, we extracted face strategic points (inner, outer and centre points of the eyebrow; centre points of the lower and upper eyelids; right, left, top and bottom corners of the mouth; and left and right external nose wing) using an active appearance model and a block-matching approach. At the last step, we classified the results by using the Kohonen self-organizing map.
    Keywords: Emotion; Interest; neural network; Kohonen; action units; facial expression; block matching.

  • Arabic sign language recognition using vision and hand tracking features with HMM   Order a copy of this article
    by Ala Addin Sidig, Hamzah Luqman, Sabri Mahmoud 
    Abstract: Sign language employs signs made by hands and facial expressions to convey meaning. Sign language recognition facilitates the communication between community and hearing-impaired people. This work proposes a recognition system for Arabic sign language using four types of features, namely Modified Fourier Transform, Local Binary Pattern, Histogram of Oriented Gradients, and a combination of Histogram of Oriented Gradients and Histogram of Optical Flow. These features are evaluated using Hidden Markov Model on two databases. The best performance is achieved with Modified Fourier Transform and Histogram of Oriented Gradients features with 99.11% and 99.33% accuracies, respectively. In addition, two algorithms are proposed, one for segmenting sign video streams captured by Microsoft Kinect V2 into signs and the second for hand detection in video streams. The obtained results show that our algorithms are efficient in segmenting sign video streams and detecting hands in video streams.
    Keywords: Arabic Sign language; Sign language recognition; video segmentation; Histogram of Oriented Gradients; Hands detection; Hidden Markov Model.
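Classification with HMMs typically scores each candidate sign's model with the forward algorithm and picks the highest-scoring one; a minimal stdlib sketch with a toy two-state model (all probabilities illustrative, not trained on the paper's features):

```python
def forward_likelihood(obs, start, trans, emit):
    """Forward algorithm: likelihood of an observation sequence under an HMM.
    A sign classifier would evaluate this for each sign's model and pick the max."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in range(n)) * emit[j][o]
                 for j in range(n)]
    return sum(alpha)

start = [0.6, 0.4]                    # initial state distribution
trans = [[0.7, 0.3], [0.4, 0.6]]      # state transition probabilities
emit = [[0.9, 0.1], [0.2, 0.8]]       # P(observation symbol | state)
lik = forward_likelihood([0, 1, 0], start, trans, emit)
```

Real systems quantize continuous MFT/LBP/HOG feature vectors into such discrete observation symbols, or use continuous emission densities.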

  • Quality of Service (QoS) Task Scheduling Algorithm for time-cost trade off Scheduling Problem in Cloud Computing Environment   Order a copy of this article
    by DANLAMI GABI, Abdul Samad Ismail, Anazida Zainal, Zailmiyah Zakaria 
    Abstract: As the cloud computing environment evolves, managing trade-offs between time and cost when executing large-scale tasks, to guarantee customers minimum running time and cost of computation, is not always feasible. Many heuristics and metaheuristics have been proposed to resolve this problem. The metaheuristics are considered promising, since they can schedule large-scale tasks as well as optimise the best-known trade-offs among conflicting objectives and return a solution in just one run. However, they are characterised by certain limitations that need to be resolved, including local trapping, poor convergence and imbalance between global and local search. In this paper, we first present a multi-objective task scheduling model, upon which a dynamic Multi-Objective Orthogonal Taguchi-Based Cat Swarm Optimisation (dMOOTC) algorithm is proposed to solve the model. In the proposed algorithm, the Taguchi orthogonal method is incorporated into the local search of conventional Cat Swarm Optimisation (CSO) to overcome local trapping and ensure diversity. A Pareto-optimisation strategy incorporated within the algorithm balances the solutions of the global and local searches. The efficiency of the proposed algorithm is studied by simulation with the CloudSim tool. Thirty independent simulation runs were conducted and results were evaluated on the following metrics: execution time, execution cost and Performance Improvement Rate Percentage (PIR%). The simulation results showed that the proposed dMOOTC algorithm can select the best-known optimal trade-off values, minimising execution time and execution cost better than single-objective conventional Cat Swarm Optimisation (CSO), Multi-Objective Particle Swarm Optimisation (MOPSO), Enhanced Parallel CSO (EPCSO) and Orthogonal Taguchi-Based Cat Swarm Optimisation (OTB-CSO) algorithms.
    Keywords: Multi-Objective; Quality of Service; Task Scheduling; Cat Swarm Optimisation; Pareto-Optimisation.

  • Data stream management system for video on demand hybrid storage server   Order a copy of this article
    by Ola Al-Wesabi, Nibras Abdullah 
    Abstract: The storage device is one of the main components of a video on demand (VOD) server. The VOD storage system is responsible for storing and streaming large videos. Hence, the VOD server requires a large storage capacity and rapid video retrieval from this storage to quickly stream videos to users. The hybrid storage system, which combines hard disk drive (HDD) and solid-state drive (SSD) components in the server, has become popular because of such requirements. The HDD is economical and provides a high storage capacity for numerous videos, while the SSD can act as a buffer for fast retrieval and streaming of videos to users. However, a plain combination of both storage modes is relatively weak at optimizing fast access and at supporting a high number of simultaneous streams. This paper presents a proposed VOD storage server system, namely an enhanced hybrid storage system (EHSS) based VOD server, to improve the performance of the VOD server. The design of the EHSS and its streaming management scheme produce high performance and satisfy the performance requirements of a VOD server in terms of I/O throughput and access latency. The experimental results show that the proposed EHSS-based VOD server with the proposed data stream controller (DSC) scheme provides better performance than the FADM-based VOD server, enhancing average response times by 69.89% across various scales of intensive workload.
    Keywords: Data stream controller (DSC); Hard disk drives (HDDs); Cache hit ratio; Hybrid storage; I/O response time; Solid-state-drive (SSD); Throughput; Video on demand (VOD).
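The SSD tier acting as a buffer in front of the HDD can be sketched as an LRU cache; this is a generic illustration of the hybrid-storage idea, not the paper's DSC scheme:

```python
from collections import OrderedDict

class SSDCache:
    """LRU cache standing in for the SSD tier; misses fall back to the HDD."""
    def __init__(self, capacity):
        self.capacity, self.store = capacity, OrderedDict()
        self.hits = self.misses = 0

    def get(self, video_id):
        if video_id in self.store:
            self.store.move_to_end(video_id)  # mark as recently used
            self.hits += 1
            return True                        # served from SSD
        self.misses += 1                       # fetch from HDD, then cache
        self.store[video_id] = True
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)     # evict least recently used
        return False

cache = SSDCache(capacity=2)
for vid in ["a", "b", "a", "c", "a", "b"]:
    cache.get(vid)
hit_ratio = cache.hits / (cache.hits + cache.misses)
```

The cache hit ratio listed in the keywords is exactly this hits/(hits+misses) quantity.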

Special Issue on: Inventive Systems and Internet of Things

  • Mixed Integer Programming for Vehicle Routing Problem with Time Windows   Order a copy of this article
    by Divya Aggarwal, Vijay Kumar 
    Abstract: As a key element in logistics distribution, the Vehicle Routing Problem has become an important research topic in management and computation science. The Vehicle Routing Problem with Time Windows is a specialization of the Vehicle Routing Problem. In this paper, a brief description of the vehicle routing problem is presented. Mixed Integer Programming (MIP) is utilized to solve the vehicle routing problem with time windows. A novel mathematical MIP model is formulated and implemented using IBM CPLEX. A novel constraint is designed to optimize the number of vehicles used. The proposed model optimizes both transportation cost and the number of vehicles used simultaneously. The model is tested on two well-known instances of Solomon's benchmark test problems. Experimental results illustrate that the proposed formulation provides promising solutions in reasonable computation time. A sensitivity analysis of customer nodes is also presented.
    Keywords: Vehicle Routing Problem; Time Windows; Solomon’s Instance; CPLEX; MIP.
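A core building block of any VRPTW formulation is time-window feasibility along a route; a minimal check (illustrative data, not a Solomon instance):

```python
def route_feasible(route, travel, windows, service=0):
    """Check time-window feasibility of one route starting at depot 0 at t=0.
    travel[(i, j)]: travel time; windows[i]: (earliest, latest) service start.
    Arriving early means waiting; arriving after `latest` is infeasible."""
    t, prev = 0, 0
    for stop in route:
        t += travel[(prev, stop)]
        earliest, latest = windows[stop]
        if t > latest:
            return False
        t = max(t, earliest) + service  # wait if early, then serve
        prev = stop
    return True

travel = {(0, 1): 4, (1, 2): 3, (0, 2): 6}
windows = {1: (5, 8), 2: (9, 12)}
ok = route_feasible([1, 2], travel, windows)
```

In the MIP, the same logic appears as linear constraints linking arrival-time variables to the time-window bounds.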

    by Sivaraman Eswaran, Manickachezian R 
    Abstract: Optimal management of cloud resources for multimedia contents is the main aim of this research. In our previous work, Multiple Kernel Learning with Support Vector Machine (MKL-SVM) was introduced, which can achieve balanced resource usage under multimedia user requests. However, the existing work does not address a caching mechanism, which can lead to more computational overhead. To solve this problem, a new method is proposed, namely Improved Storage and Scheduling of Multimedia Contents in Cloud Storage (ISS-MCCS). In this work, Fuzzy Neural Network Classification (FNNC) is utilized for handling unevenly loaded server clusters. Task scheduling is then done using a Hybrid Genetic-Cuckoo Search Algorithm (HGCSA), where a hybrid fuzzy weighting scheme is used for fitness evaluation. Finally, an Adaptive Replacement Cache (ARC) is integrated to optimize memory. The overall assessment of the research work is done in the CloudSim environment, which shows that it can manage multimedia contents efficiently.
    Keywords: Multimedia contents; multiple QoS; optimized scheduling; efficient load balancing; adaptive replacement; fuzzy neural network classification.

  • Context Aware Reliable Sensor Selection in IoT   Order a copy of this article
    by K. R. Remesh Babu Raman, M. Vishnu Prathap, Philip Samuel 
    Abstract: The Internet of Things (IoT) is a computing concept in which physical objects with embedded sensors connect to the Internet and can identify themselves to other devices. Today the number of devices connected to the Internet is increasing rapidly, and they need to communicate with each other for different purposes. The future Internet will comprise billions of intelligent communicating objects with capabilities for sensing, actuating and data processing. Each object in these Cyber Physical Systems (CPS) will have one or more embedded sensors that capture huge amounts of data. Managing these data in the cloud and obtaining the relevant data from appropriate sensors are important concerns. For information retrieval, context awareness is important: users typically need information from these sensors depending on several factors such as location and accuracy level. The proposed method senses reliable data from a sensor environment that satisfies user contexts. It also provides functionalities such as user addition, location sensing, context specification, user context counting, and selection of the current best results.
    Keywords: Cloud Computing; Context Awareness; IoT; Sensor selection; Cloud of Things; Resource management.
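Selecting sensors that satisfy user contexts such as location and accuracy can be sketched as a filter-then-rank step; the field names below are illustrative, not taken from the paper:

```python
def select_sensors(sensors, context):
    """Filter sensors by user context (location match and minimum accuracy),
    then rank the survivors by reliability, best first."""
    matches = [s for s in sensors
               if s["location"] == context["location"]
               and s["accuracy"] >= context["min_accuracy"]]
    return sorted(matches, key=lambda s: s["reliability"], reverse=True)

sensors = [
    {"id": "t1", "location": "lab", "accuracy": 0.9, "reliability": 0.95},
    {"id": "t2", "location": "lab", "accuracy": 0.7, "reliability": 0.99},
    {"id": "t3", "location": "hall", "accuracy": 0.9, "reliability": 0.90},
]
best = select_sensors(sensors, {"location": "lab", "min_accuracy": 0.8})
```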

  • Lagrangian Relaxation for Distribution Networks with Cross-Docking Center   Order a copy of this article
    by Manpreet Singh Bhangu, Rimmi Anand, Vijay Kumar 
    Abstract: This paper proposes cross-docking in a logistics network with the aim of reducing transportation costs. The proposed strategy eliminates the need for inventory to store commodities. It consists of two main stages. In the first echelon, the number and locations of cross-dock warehouses are determined. In the second echelon, the allocation of warehouses as cross-docks to distribution centers is determined. Each warehouse has limited capacity, and each distribution center can be supplied by only one cross-dock. The problem is mathematically formulated using a mixed-integer programming model built on the concepts of cross-docking allocation and commodity distribution. A Lagrangian relaxation approach is proposed to solve the logistics network problem. Experimental results reveal that the proposed approach provides optimal solutions in a reasonable time.
    Keywords: Facility location; Network design; Mixed-integer programming; Lagrangian relaxation; Merge-in-Transit.
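Lagrangian relaxation dualises the hard constraints (here, warehouse capacity) and updates the multipliers by subgradient steps; a toy single-constraint sketch with illustrative data:

```python
def subgradient_update(lmbda, violation, step):
    """One multiplier update in Lagrangian relaxation:
    lambda_{k+1} = max(0, lambda_k + step * violation),
    where `violation` is g(x_k) for a relaxed constraint g(x) <= 0."""
    return max(0.0, lmbda + step * violation)

# Toy: one cross-dock with capacity 5; relax the constraint demand - 5 <= 0.
# As the relaxed subproblems push the violation toward zero, the step sizes shrink.
lmbda, demands = 0.0, [7, 6, 5.5, 5.2, 5.05]
for k, d in enumerate(demands, start=1):
    lmbda = subgradient_update(lmbda, d - 5, step=1.0 / k)
```

In the full method, the multiplier prices capacity into the objective, so the relaxed subproblems decompose and yield lower bounds on the MIP optimum.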

  • Factors Influencing Regression Testing on Cloud and On-Premises: An Analysis   Order a copy of this article
    by Suma V, Narasimha Murthy M S 
    Abstract: Since the evolution of software, it has had an impact in all domains of operation, and the software industry has taken up a major role. Hence, software must be upgraded as the market demands so that it can sustain itself in the industrial environment. One of the most important criteria for software industries to survive is the development of high quality software that can completely satisfy customers. To achieve this goal, it becomes mandatory for software organizations to adapt themselves to market dynamics. Since cloud computing has now gained wide popularity as a promising technology, IT services have marched to serve the needs of society through cloud based technology, and testing applications in the cloud has become one of the dominant areas of operation. The main objective of this paper is therefore to analyse the effectiveness and efficiency of testing applications in a cloud environment. The paper further puts forth a case study comprising an empirical investigation carried out in a leading software company that follows cloud technology in its day-to-day development activities. A comprehensive analysis is conducted upon sampled data collected from two different domains, namely the health care and telecom domains of the company under investigation. From the analysis, it is observed that testing applications in the cloud model is a good practice compared with the conventional mode of software testing (on-premises). The results also depict that testing applications in a cloud environment improves the performance of various parameters of the testing process. This inference paves the way for further research to formulate effective strategies for testing applications in the cloud.
    Keywords: Software Engineering; Software Testing; Cloud Computing; Regression Testing; Software Quality; Total Customer Satisfaction.

  • A Hybrid Test Prioritization Technique for Combinatorial Testing   Order a copy of this article
    by Preeti Satish, Krishnan Rangarajan 
    Abstract: IoT systems comprise multiple devices connected together to perform an intelligent task in real time. Such systems have to be meticulously tested in order to avoid hazardous situations. The combinatorial testing technique can effectively test such complex IoT systems with reduced effort, as it generates fewer test cases with adequate coverage. It tests the interactions that exist between values of different parameters; in practice, faults involving up to six-way interactions have been found. Prioritization of combinatorial tests deals with finding an ideal order of the test cases so that faults are detected early. Recent approaches to the prioritization problem are either coverage based or parameter-value weight based, for two-way or three-way interaction strengths separately. In this paper, we present a hybrid prioritization technique for combinatorial testing that combines both the weight based and the interaction coverage based approaches. We derive a combined weight for each test case considering user-given weights denoting the importance of parameter values, and interaction coverage up to six-way interactions. To the best of our knowledge, no research has yet been carried out that accounts for higher-order combinations up to six-way at a time. To demonstrate the effectiveness of our algorithm, we have conducted initial synthetic experiments on various covering arrays and measured the effectiveness with the rate of fault detection metric. The results are promising in covering the combinations early.
    Keywords: IoT systems; combinatorial testing; prioritization; combinatorial coverage; weight based; hybrid technique; interaction testing; interaction strength.
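Interaction-coverage-based prioritization greedily picks the test covering the most not-yet-covered combinations; a two-way stdlib sketch (the paper's hybrid additionally weights parameter values and extends coverage up to six-way):

```python
from itertools import combinations

def pairs(test):
    """All (parameter, value) pairs of interactions covered by one test case."""
    return {((i, test[i]), (j, test[j])) for i, j in combinations(range(len(test)), 2)}

def prioritize(tests):
    """Greedy two-way interaction-coverage prioritization."""
    remaining, ordered, covered = list(tests), [], set()
    while remaining:
        best = max(remaining, key=lambda t: len(pairs(t) - covered))
        remaining.remove(best)
        ordered.append(best)
        covered |= pairs(best)
    return ordered

tests = [(0, 0, 0), (0, 0, 1), (1, 1, 1)]
order = prioritize(tests)
```

Note how the greedy order interleaves dissimilar tests: after the first test, the most different one contributes the most new interactions and is scheduled next.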

  • Intelligent systems for Redundancy Removal with Proficient Run Length Coding and statistical analysis using regression   Order a copy of this article
    by V.R. PRAKASH, S. Nagarajan 
    Abstract: Surveillance video has become one of the key technologies in tactical monitoring. However, analysing large quantities of video while maintaining video quality in a reasonable amount of time can degrade its error metrics. The analysis is therefore organised hierarchically: four videos were taken and their peak errors analysed. The work covers feature extraction, comparison of the input and extracted textures, and feature analysis using the cosine angle distance. Finally, a multiple regression model is developed with PSNR as the dependent variable and video size and execution time as the independent variables. The resulting prediction equation is used to approach the optimal PSNR value for varying video size and execution time.
    Keywords: Proficient Run Length Coding; Regression analysis.
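    The regression step described in the abstract amounts to fitting PSNR = b0 + b1·size + b2·time by ordinary least squares. A minimal sketch follows; the function names are assumptions and the coefficients would come from the paper's measured data, not from this example.

    ```python
    # Multiple regression sketch: PSNR as a function of video size and
    # execution time, fitted by ordinary least squares with numpy.
    import numpy as np

    def fit_psnr_model(sizes_mb, times_s, psnr_db):
        """Fit PSNR = b0 + b1*size + b2*time; return [b0, b1, b2]."""
        X = np.column_stack([np.ones(len(sizes_mb)), sizes_mb, times_s])
        coef, *_ = np.linalg.lstsq(X, np.asarray(psnr_db), rcond=None)
        return coef

    def predict_psnr(coef, size_mb, time_s):
        """Evaluate the fitted prediction equation at a new point."""
        return coef[0] + coef[1] * size_mb + coef[2] * time_s
    ```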

  • An Intelligent Inventive System for Personalized Web Page Recommendation based on Ontology Semantics   Order a copy of this article
    by Gerard Deepak, Ansaf Ahmed, Skanda B 
    Abstract: Owing to the information diversity of the Web and its dynamically changing contents, extraction of relevant information from the Web is a huge challenge. With the World Wide Web transforming into a more organized Semantic Web, the incorporation of semantic techniques to retrieve relevant information is highly necessary. In this paper, a dynamic ontology alignment technique for recommending relevant Web pages is proposed. The strategy focuses on knowledge tree construction by computing the semantic similarity between the query terms and the ontological entities. Furthermore, semantic similarity is again computed between the nodes of the constructed knowledge tree and the URLs in the URL repository to recommend relevant Web pages. The dynamic ontology alignment by computing these semantic similarities constitutes Ontology Semantics. Personalization is achieved by prioritizing Web pages through content-based analysis of the user's Web usage data. An overall accuracy of 87.73% is achieved by the proposed approach.
    Keywords: Ontologies; Personalized; Semantic Strategy; Web Page Recommendation System; Web Search.
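    The final ranking step described above can be sketched as cosine similarity between a query's term vector and each candidate page's term vector. This is a deliberately simplified illustration: the paper's knowledge-tree construction and ontology alignment are richer than the bag-of-words similarity assumed here, and all names are hypothetical.

    ```python
    # Cosine-similarity ranking sketch for web page recommendation.
    import math
    from collections import Counter

    def cosine(a: Counter, b: Counter) -> float:
        """Cosine similarity between two sparse term-frequency vectors."""
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def recommend(query, url_repo, top_k=3):
        """Rank URLs by similarity to the query; keep positive matches."""
        q = Counter(query.lower().split())
        scored = [(cosine(q, Counter(text.lower().split())), url)
                  for url, text in url_repo.items()]
        return [url for s, url in sorted(scored, reverse=True)[:top_k] if s > 0]
    ```

    `url_repo` maps each URL to a textual description of the page; in the paper this role is played by the URL repository matched against knowledge-tree nodes.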

    by Vijaya Bharathi Manjeti, Koteswara Rao Kodepogu 
    Abstract: Modern systems are becoming highly configurable to satisfy the varying needs of customers and users. Software product lines are consequently becoming a common trend in software development to reduce cost by enabling systematic, large-scale reuse. Some faults may be exposed only if a particular combination of features is selected in the delivered products, yet testing all combinations is usually infeasible in practice because of their extremely large number. Combinatorial testing is a technique to generate smaller test suites in which all combinations of t features are guaranteed to be tested. In this paper, we present several theorems describing the probability of random testing detecting interaction faults and compare the results with combinatorial testing. For instance, random testing becomes considerably more effective as the number of features increases and converges toward equal effectiveness with combinatorial testing. However, when constraints are present among features, random testing can fare arbitrarily worse than combinatorial testing. Consequently, in order to have a practical impact, future research should concentrate on combinatorial testing.
    Keywords: fault; framework; vantage.
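    The comparison between random and combinatorial testing discussed above can be explored empirically: generate random test cases and measure what fraction of all t-way value combinations they cover, versus the guaranteed 100% of a covering array. The following simulation sketch is an illustration under assumed parameters, not the paper's theoretical results.

    ```python
    # Simulation sketch: t-way interaction coverage of a random test suite.
    import random
    from itertools import combinations

    def random_tway_coverage(n_params, n_values, t, n_tests, seed=0):
        """Fraction of all t-way value combinations covered by n_tests
        uniformly random test cases (each parameter has n_values values)."""
        rng = random.Random(seed)
        covered = set()
        for _ in range(n_tests):
            test = [rng.randrange(n_values) for _ in range(n_params)]
            for idx in combinations(range(n_params), t):
                covered.add((idx, tuple(test[i] for i in idx)))
        total = len(list(combinations(range(n_params), t))) * n_values ** t
        return len(covered) / total
    ```

    A covering array of strength t reaches coverage 1.0 by construction; random testing approaches it only as the suite grows, and (as the abstract notes) constraints among features can make random testing arbitrarily worse.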

  • Effect of Magnetizing Core on Impedance and Induced EMF of Two Coils Wound on Single Iron Core   Order a copy of this article
    Abstract: A copper wire wound on an iron rod is called an iron-cored inductor. An iron-cored inductor essentially has two components: resistance and inductance. These two parameters depend on the permeability of the core. In this paper, the variation of the EMF induced in a secondary coil placed on the same core at different magnetic fields is presented. A detailed study of the dependence of the resistance and inductance is also presented.
    Keywords: Magnetization; Iron cored inductor; Induced EMF.
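    The quantities studied in this abstract follow standard relations: the impedance of a coil with resistance R and inductance L at frequency f is Z = sqrt(R^2 + (2*pi*f*L)^2), and the RMS EMF induced in a secondary winding is E = 4.44*f*N*Phi_max (the standard transformer EMF equation). The sketch below computes both; the numeric values in the usage test are illustrative, not the paper's measurements.

    ```python
    # Impedance and induced-EMF sketch for an iron-cored coil pair.
    import math

    def coil_impedance(resistance_ohm, inductance_h, freq_hz):
        """Impedance magnitude Z = sqrt(R^2 + X_L^2), X_L = 2*pi*f*L."""
        reactance = 2 * math.pi * freq_hz * inductance_h
        return math.hypot(resistance_ohm, reactance)

    def induced_emf_rms(freq_hz, turns, flux_peak_wb):
        """RMS EMF in a winding of N turns: E = 4.44 * f * N * Phi_max."""
        return 4.44 * freq_hz * turns * flux_peak_wb
    ```

    Since both R and L vary with the core's permeability, magnetizing the core shifts Z and the secondary EMF together, which is the dependence the paper examines.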