Forthcoming articles

 


International Journal of Intelligent Engineering Informatics

 

These articles have been peer-reviewed and accepted for publication in IJIEI, but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

 

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

 

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

 

Articles marked with this Open Access icon are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licenses.

 

Register for our alerting service, which notifies you by email when new issues of IJIEI are published online.

 

We also offer RSS feeds which provide timely updates of tables of contents, newly published articles and calls for papers.

 

International Journal of Intelligent Engineering Informatics (31 papers in press)

 

Regular Issues

 

  • A Twofold Self-Healing Approach for MANET Survivability Reinforcement   Order a copy of this article
    by Leila Mechtri, Fatiha Djemili Tolba, Salim Ghanemi, Damien Magoni 
    Abstract: Distributed systems are by nature fault-prone. The situation becomes more complex in the presence of intrusions, which continue to grow in both number and severity, especially in open environments such as MANETs. In this paper, we present a twofold self-healing approach to reinforce MANET survivability. First, a fault-tolerant IDS is designed by replicating individual agents within MASID to ensure continuous supervision of the network. However, since not all intrusions are predictable, an intrusion may seriously affect the network before it is detected and completely removed. Therefore, even if the implications of intrusions can be minimized by the intrusion detection system MASID, recovering altered or deleted data remains a vital step to ensure the correct functioning of the network. For that reason, a recovery-oriented approach for a self-healing MANET is also presented. It is based on the ability of MASID-R to assess the damage caused by the detected intrusions and aims at enabling the supervised network to heal itself of those faults and damages. Simulations using ns-2 have been performed to study the feasibility and demonstrate the optimality of the proposed approach.
    Keywords: Survivability; Fault-tolerance; Intrusion Detection; Self-healing; Replication; Recovery-oriented Approach; MANET.

  • Increasing the Hiding Capacity in Image Steganography using Braille Code.   Order a copy of this article
    by Mona A. S. Ali, Essam H. Houssein, Noha Eldemerdash, Aboul Ella Hassanien 
    Abstract: Least Significant Bit (LSB) insertion steganography is one of the most widely used methods for implementing covert data channels in image file exchanges. This popularity stems from its simple implementation and the low computational complexity of the algorithm, and above all from the low image distortion it introduces. Many researchers try to increase the embedding capacity of the LSB algorithm by using more layers of the image while keeping image distortion minimal. This paper introduces a new approach for embedding data within images using Braille code and a bit-slicing technique. It is shown that this steganography method causes minimal visual distortion while also hiding the message as a secure and compact code.
    Keywords: Steganography; bit-slicing technique; Braille method.
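
    To make the baseline concrete, the sketch below illustrates plain LSB insertion on a grayscale image, which is the technique whose capacity the paper extends; the Braille coding and bit-slicing stages described in the abstract are not reproduced, and the function names are illustrative only.

      import numpy as np

      def lsb_embed(cover: np.ndarray, message_bits: list) -> np.ndarray:
          """Embed a bit sequence into the least significant bits of a uint8 grayscale cover image."""
          stego = cover.copy().ravel()
          if len(message_bits) > stego.size:
              raise ValueError("message too long for this cover image")
          for i, bit in enumerate(message_bits):
              stego[i] = (stego[i] & 0xFE) | bit   # clear the LSB, then set it to the message bit
          return stego.reshape(cover.shape)

      def lsb_extract(stego: np.ndarray, n_bits: int) -> list:
          """Recover the first n_bits embedded by lsb_embed."""
          return [int(p) & 1 for p in stego.ravel()[:n_bits]]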

  • ECG Signals Classification: A review   Order a copy of this article
    by Essam Houssein, Moataz Kilany, Aboul Ella Hassanien 
    Abstract: The electrocardiogram (ECG), a non-stationary signal, is extensively used to evaluate the rate and rhythm of heartbeats. Comparing the overall ECG waveform pattern and shape enables doctors to diagnose possible diseases. This paper presents various applications of feature extraction and Machine Learning (ML) used in ECG classification. Its main purpose is to provide an overview of the use of ML and swarm optimization algorithms on ECG signals to obtain the best performance accuracy in recognizing abnormal Cardiovascular Diseases (CVDs). ECG feature extraction is the main stage in ECG signal classification: it seeks a set of relevant features of the ECG data that can attain the best classification accuracy, and several different classifiers are available. Swarm optimization algorithms are combined with classifiers to search for the classification parameter values that best fit the discriminant purpose and for the feature subset that produces the highest classification performance. Finally, the paper introduces an ECG heartbeat classification approach based on Water Wave Optimization (WWO) and SVM. The published literature reviewed in this paper indicates the potential of ANN and SVM as useful tools for ECG classification. The authors strongly believe that this review will be useful to researchers and engineers working in this area to find the relevant references and the current state of the field.
    Keywords: Electrocardiogram (ECG); Feature extraction; Feature optimization; Classification; Artificial Neural Networks (ANNs); Support Vector Machines (SVMs).
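
    As background to the classification stage discussed above, the sketch below trains a conventional RBF-kernel SVM on pre-extracted heartbeat features; the file names are hypothetical, and the swarm-based tuning (e.g. WWO) mentioned in the abstract is replaced here by a plain grid search over the same SVM parameters.

      import numpy as np
      from sklearn.model_selection import GridSearchCV, train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # X: (n_beats, n_features) matrix of extracted ECG features; y: arrhythmia class labels
      X, y = np.load("ecg_features.npy"), np.load("ecg_labels.npy")   # hypothetical files
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      # C and gamma are exactly the parameters a swarm optimizer would search over
      model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
      grid = GridSearchCV(model, {"svc__C": [1, 10, 100], "svc__gamma": [0.01, 0.1, 1]}, cv=5)
      grid.fit(X_tr, y_tr)
      print("test accuracy:", grid.score(X_te, y_te))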

  • Assessing visual control activities in ceramic tile surface defect detection: an eye-tracking study   Order a copy of this article
    by Berna Ulutas, N. Firat Ozkan 
    Abstract: Digital cameras and image processing algorithms can be helpful for the inspection and classification of ceramic tiles on a production line. However, workers' decision-making capability and their ability to tolerate some types of defects are the main reasons why several firms still rely on human visual inspection. Further, it is believed that the investment and maintenance costs of automated systems may be higher than labor costs. This study considers a ceramic tile line where workers are assigned to identify tile surface defects. The main aim is to draw attention to the differences between novice and expert workers in terms of visual scanning performance and the mental workload indicators that result from high concentration during visual inspection. A mobile eye-tracker is used to record fixation durations and the number of fixations in order to determine the fatigue that arises over a period of working time. The data are analyzed, and it is concluded that eye-tracking systems have the potential to identify human-related problems during visual inspection.
    Keywords: eye-tracking; ceramic tile manufacturing; visual inspection; surface inspection; analyzing human work.
    DOI: 10.1504/IJIEI.2017.10008451
     
  • An Intelligent Undersampling Technique based upon Intuitionistic Fuzzy sets to alleviate Class Imbalance Problem of Classification with Noisy Environment   Order a copy of this article
    by Prabhjot Kaur, Anjana Gosain 
    Abstract: Traditional classification algorithms (TCA) do not work well with unequal class sizes. There are applications where the requirement is to discover exceptional/rare cases, such as frauds in a credit card database or fraudulent mobile calls. TCA, when applied in such cases, fail to detect the rare cases. This is known as the class imbalance problem. The problem is more serious when TCA are applied to data distributions having other impurities such as noise, overlapping classes and imbalance within classes. This paper presents an intelligent undersampling and ensemble-based classification method to resolve the problem of imbalanced classes in noisy situations. Synthetic datasets with different extents of noise are used to assess the classification performance of the proposed techniques. The results indicate that the presented undersampling and ensemble-based classifier techniques have better classification performance in noisy situations when compared with RUS and SMOTE used with classifiers such as C4.5, RIPPLE, KNN, SVM, MLP and naive Bayes, and with ensemble techniques such as Boosting, Bagging and Random Forest.
    Keywords: Class Imbalance; Intuitionistic Fuzzy Set; Undersampling; Class imbalance Learning; skewed distribution; Noisy environment; data level methods; ensemble approaches; Bagging; Boosting; Randomforest; Noise detection.

  • The application of controlled natural language on carbon market domain knowledge for enhanced retrieval of information   Order a copy of this article
    by F.H. Abanda 
    Abstract: Natural language processing has been used to model knowledge about different domains for use in different applications, particularly by users with minimal computer science skills. Although natural language is easy to learn, it is often riddled with ambiguities, vagueness and a very high potential for inconsistencies. Thus, controlled natural language, a subset of natural language, has emerged and promises to improve on some major natural language deficiencies. Given the multiplicity of controlled natural languages, Attempto Controlled English (ACE), one of the leading controlled natural languages, was chosen and explored to represent knowledge about the carbon market domain. The outcome of this study is a prototypical controlled natural language carbon market ontology that can be used in making informed decisions about investment in carbon marketing. In order to establish its suitability for the purpose for which the prototype was developed, it was evaluated using appropriate techniques and ontology reasoners.
    Keywords: Attempto Controlled English; carbon market; natural language; ontology.

Special Issue on: Advances in Intelligent Big Data Analytics

  • Empirical Investigation of Dimension Hierarchy Sharing Based Metrics for Multidimensional Schema Understandability   Order a copy of this article
    by Anjana Gosain, Jaspreeti Singh 
    Abstract: Over the last years, quality has gained a lot of importance in the development of data warehouse systems. Predicting the understandability of multidimensional schemas could play a key role in controlling data warehouse quality at early stages of development. In this area, some effort has been spent on defining structural metrics and identifying models for assessing the quality of these systems. Among the structural properties used to define metrics, dimension hierarchies and their sharing play a primary role in enhancing the analytical capabilities of multidimensional schemas, thereby affecting their quality. The authors have previously proposed structural metrics based on these aspects. The objective of this study is to apply Principal Component Analysis (PCA) to find out whether our metrics are improvements over other existing metrics, and to apply Logistic Regression to study whether the metrics selected as relevant in the extracted principal components, combined together, are indicators of multidimensional schema understandability. The results of PCA confirm that our structural metrics based on the concept of sharing are different from other such metrics existing in the literature. Further, the metrics selected as principal components can be used in combination to predict the understandability of data warehouse multidimensional schemas.
    Keywords: Data Warehouse; Quality Metrics; Principal Component Analysis; Logistic Regression; Understandability; Multidimensional Schemas.
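
    A minimal sketch of the two statistical steps named in the abstract, assuming a matrix X of metric values (one row per schema, one column per structural metric) and a binary understandability label y; the file names and thresholds are illustrative, not taken from the study.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LogisticRegression
      from sklearn.preprocessing import StandardScaler

      X = np.loadtxt("schema_metrics.csv", delimiter=",")       # hypothetical metric values
      y = np.loadtxt("understandability.csv", delimiter=",")    # 1 = understandable, 0 = not

      # Step 1: PCA to check which metrics load on distinct principal components
      Xs = StandardScaler().fit_transform(X)
      pca = PCA(n_components=0.95)            # keep components explaining 95% of the variance
      scores = pca.fit_transform(Xs)
      print("component loadings:\n", pca.components_)

      # Step 2: logistic regression on the retained components as a combined predictor
      clf = LogisticRegression().fit(scores, y)
      print("in-sample accuracy:", clf.score(scores, y))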

  • Detecting Concept Drift using HEDDM in Data Stream   Order a copy of this article
    by Snehlata Dongre 
    Abstract: In an evolving data stream, a change in the underlying concept is known as concept drift. Detecting and handling concept drift is a challenging task in data stream mining. If an algorithm does not adapt to concept drift, its performance is directly affected. A number of algorithms have been developed to handle concept drift, but they are not suited to both sudden concept drift and gradual concept drift. Thus, there is a demand for an algorithm that can react to both types of concept drift and incur low computational cost. A new approach, the Hybrid Early Drift Detection Method (HEDDM), is proposed for drift detection; it works with an ensemble method to improve performance.
    Keywords: Concept drift; data stream; classification; ensemble classifier; concept drift detection; DDM; EDDM; HEDDM; data stream mining; evolving data stream.
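
    The abstract does not detail HEDDM itself, but the classical DDM test that it builds on (and that appears in the keywords) can be sketched briefly; the warning and drift thresholds below follow the standard formulation by Gama et al., not the authors' hybrid.

      import math

      class DDM:
          """Classical Drift Detection Method: monitor the online error rate of a classifier."""
          def __init__(self, min_samples: int = 30):
              self.min_samples = min_samples
              self.reset()

          def reset(self):
              self.n, self.errors = 0, 0
              self.p_min, self.s_min = float("inf"), float("inf")

          def update(self, error: bool) -> str:
              self.n += 1
              self.errors += int(error)
              p = self.errors / self.n                       # running error rate
              s = math.sqrt(p * (1 - p) / self.n)            # its standard deviation
              if self.n < self.min_samples:
                  return "in-control"
              if p + s < self.p_min + self.s_min:
                  self.p_min, self.s_min = p, s
              if p + s >= self.p_min + 3 * self.s_min:       # drift level: signal and reset
                  self.reset()
                  return "drift"
              if p + s >= self.p_min + 2 * self.s_min:       # warning level
                  return "warning"
              return "in-control"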

  • Measuring harmfulness of class imbalance by data complexity measures in oversampling methods   Order a copy of this article
    by Deepika Singh, Anjana Gosain, Anju Saha 
    Abstract: Many real-world applications involve skewed datasets, which give rise to the class imbalance problem. During classification, class imbalance causes underestimation of minority classes. Researchers have proposed a number of algorithms to deal with this problem, but recent studies have shown that some skewed datasets are unharmful, and applying class imbalance algorithms to these datasets leads to degraded performance and increased execution time. In this paper, we pre-estimate the degree of harmfulness of class imbalance for skewed classification problems using two data complexity measures: the scatter-matrix-based class separability measure and the ratio of intra-class versus inter-class nearest neighbors. The performance of oversampling-based class imbalance classification algorithms is also analyzed with respect to these data complexity measures. The experiments are conducted using k-nearest neighbor (k-NN) and naive Bayes as the base classifiers. The obtained results illustrate the usefulness of these measures by providing prior information about the nature of imbalanced datasets, which helps us select the more efficient classification algorithm.
    Keywords: class imbalance; data complexity measure; class separability measure; class overlapping; inter-class nearest neighbor; intra-class nearest neighbor; imbalance ratio; oversampling method.
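
    One of the two complexity measures used above, the ratio of intra-class to inter-class nearest-neighbor distances, can be computed as sketched below; this is a common formulation of the measure and may differ in detail from the authors' implementation, and the scatter-matrix separability measure is not shown.

      import numpy as np
      from scipy.spatial.distance import cdist

      def intra_inter_nn_ratio(X: np.ndarray, y: np.ndarray) -> float:
          """Sum of distances to the nearest same-class neighbor divided by the sum of
          distances to the nearest other-class neighbor; larger values mean more overlap."""
          D = cdist(X, X)
          np.fill_diagonal(D, np.inf)                       # a point is not its own neighbor
          same = y[:, None] == y[None, :]
          intra = np.where(same, D, np.inf).min(axis=1)     # nearest same-class neighbor
          inter = np.where(~same, D, np.inf).min(axis=1)    # nearest other-class neighbor
          return float(intra.sum() / inter.sum())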

  • An Ensemble Clustering Method for Intrusion Detection   Order a copy of this article
    by KAPIL WANKHADE 
    Abstract: The amount of data in the field of computer networking is growing rapidly, and this raises new challenges for Intrusion Detection Systems (IDS). To handle such increasing volumes of data, new hybrid approaches have to be developed to achieve a high detection rate and a low false alarm rate. An intrusion detection system plays a vital role in detecting malicious attacks, and data mining and machine learning techniques play a vital role in attack detection. This paper focuses on the detection rate and the false alarm rate; to address these issues, a hybrid ensemble clustering method is proposed. The method tries to increase the detection rate while lowering the false alarm rate. It has been tested on the KDDCup99 network intrusion dataset and performs well compared with other algorithms in terms of detection rate and false alarm rate.
    Keywords: boosting; classification; clustering; data mining; divide and merge; detection rate; false alarm rate; intrusion detection system; ensemble method; k-means.
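
    To illustrate the clustering-based detection idea that the ensemble builds on, the sketch below clusters connection records with k-means and labels each cluster by majority vote; the file names and the number of clusters are hypothetical, and the paper's divide-and-merge ensemble and boosting steps are not reproduced.

      import numpy as np
      from sklearn.cluster import KMeans

      # X: numeric connection features (e.g. preprocessed KDDCup99 records); y: 0 = normal, 1 = attack
      X = np.load("kdd_features.npy")                        # hypothetical files
      y = np.load("kdd_labels.npy").astype(int)

      km = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X)
      # Label each cluster by the majority class of the training records it contains,
      # then classify a new connection according to the label of its nearest centroid.
      cluster_label = np.array([np.bincount(y[km.labels_ == c]).argmax() for c in range(20)])

      def classify(X_new: np.ndarray) -> np.ndarray:
          return cluster_label[km.predict(X_new)]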

  • Dynamic Social Network Analysis and Performance Evaluation   Order a copy of this article
    by Sanur Sharma, Anurag Jain 
    Abstract: Social media usage is increasing tremendously, as is the enormous amount of data it generates, which includes users' personal details, their images and the content shared on such open platforms. This has led to a lot of research into and analysis of such networks and the data that exists in social media. This paper focuses on the dynamic analysis of social networks, where snapshots of the network are taken at regular intervals and analysed with various performance measures. The real-world email dataset of a company (ENRON) has been evaluated and visualized dynamically. The network measures are evaluated at each timestamp, clustering is performed on that data, and its performance is calculated with various measures. The tabu search optimization algorithm has been used for clustering the timestamped data, and a comparison is made between fixed-size and variable-size clusters. The results suggest that for certain timestamps the precision, recall and F-measure values for fixed-size clusters are better than those for variable-size clusters. These measures can further be used for the selection of dynamic clustering techniques for social network analysis.
    Keywords: Social Network; Dynamic Social Network; Clustering; Dynamic Network Analysis; Data Mining.

Special Issue on: CODIT'2016 New Trends in Intelligent Systems Modelling and Control

  • Tardiness minimization heuristic for job shop scheduling under uncertainties using group sequences   Order a copy of this article
    by Zakaria YAHOUNI, Nasser MEBARKI, Zaki SARI 
    Abstract: In an industrial environment, manufacturing systems may be subject to considerable uncertainties, which can lead to numerous schedule disturbances. These disturbances prevent the execution of a manufacturing schedule as it was planned. The "groups of permutable operations" method copes with this drawback by proposing a family of schedules instead of a unique one. However, selecting the appropriate schedule that accounts for real-time disturbances represents a combinatorial optimization challenge. In this paper, we propose a new decision-aid criterion for selecting the schedule that best fits the real state of the shop. This criterion is measured using a greedy heuristic that anticipates the maximum tardiness in a job shop scheduling environment. Simulation tests performed on benchmark problems show the usefulness of the proposed criterion compared with another frequently used criterion. The final results emphasize the usefulness of this criterion in a bi-criteria decision-aid system.
    Keywords: Scheduling; Decision-aid system; Job shop; Maximum tardiness; Optimization.

  • Skewness Map: Estimating Object Orientation for High Speed 3D Object Retrieval System   Order a copy of this article
    by Vicky Sintunata, Kurumi Kaminishi, Terumasa Aoki 
    Abstract: A 3D object retrieval system retrieves a similar or identical object from a database given a 2D query image (sketch or photograph). Unfortunately, since the appearance of a 3D object varies with the viewing direction, a vast number of 2D rendered images must be processed (matched) to solve this problem. In this paper, we present a novel method called the Skewness Map to relieve this problem. The Skewness Map can estimate the orientation of the object and accurately select a few representative images from the database, so matching every image in the database can be avoided. Experimental results show that the retrieval system becomes much faster (14 times faster in matching time) and accurate in estimating the object orientation (less than one degree of error on average).
    Keywords: Skewness; Object Orientation; 3D Object Retrieval System.
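
    The abstract does not spell out how skewness is mapped to orientation, so the sketch below shows only the generic ingredient: per-block sample skewness of pixel intensities of a rendered or query image, with the block size as a hypothetical parameter.

      import numpy as np
      from scipy.stats import skew

      def skewness_map(gray: np.ndarray, block: int = 16) -> np.ndarray:
          """Per-block sample skewness of pixel intensities of a grayscale image."""
          h = (gray.shape[0] // block) * block               # crop to a whole number of blocks
          w = (gray.shape[1] // block) * block
          tiles = gray[:h, :w].reshape(h // block, block, w // block, block).swapaxes(1, 2)
          flat = tiles.reshape(-1, block * block).astype(float)
          return skew(flat, axis=1).reshape(h // block, w // block)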

  • Enhanced Approach to Cascade Reconfiguration Control Design   Order a copy of this article
    by Dusan Krokavec, Anna Filasova 
    Abstract: Following the concept of fault-tolerant control systems, the paper is concerned with the problem of reconfiguration to retain fault tolerance in the control of linear continuous-time systems with system dynamics faults. The main idea is to use a reference model output to be followed when a fault occurs, while the nominal control loop structure is kept untouched and the controllers with nominal parameters remain part of the reconfigured control loop scheme. The full state control principle is applied for the nominal control strategy, and the static output control principle is proposed for the compensation control law specification. Exploiting the D-stability circle region precept, new conditions for the design of the control law parameters are introduced and proven, and the stability of the cascade-like reconfiguration structure is analyzed. To illustrate the resulting properties, the proposed feasible procedure is compared with one obtained using the bounded real lemma principle. The results, offering sufficient and necessary design conditions, are illustrated with a numerical example to show the effectiveness of the proposed approach and its applicability.
    Keywords: fault tolerant control; state control; asymptotic stability; static output control; cascade structures; Lyapunov inequality; linear matrix inequalities.

  • Discovering dependencies between domains of redox potential and plant defence through triplet extraction and copulas   Order a copy of this article
    by Dragana Miljkovic, Nada Lavrač, Marko Bohanec, Biljana Mileva Boshkoska 
    Abstract: Knowledge discovery, especially in the field of literature mining, often involves searching for interconnecting concepts between two different literature domains, which might bring new understanding of both domains. This paper presents a new approach to discovering dependencies between different biological domains based on copula analysis of literature mining results. More specifically, we have explored dependencies between the literatures of the plant defence response and redox potential domains. Copula analysis of triplets extracted by the Bio3graph tool shows that dependencies exist between these two domains, indicating a potential for cross-domain literature exploration. Bio3graph is a rule-based natural language processing tool which extracts relations in the form of (subject, predicate, object) triplets. It is publicly available at http://ropot.ijs.si/bio3graph/software/. The copula analysis was performed using Clayton and Frank fully nested copulas, and the software is publicly available at http://source.ijs.si/bmileva/copulasfordexapps.git.
    Keywords: triplets; relation extraction; modelling the domain dependence; copula functions.

  • Application of Multi-Verse Optimizer Based Fuzzy-PID Controller to Improve Power System Frequency Regulation in Presence of HVDC Link   Order a copy of this article
    by Nour E.L. Yakine KOUBA, Mohamed Menaa, Mourad Hasni, Mohamed Boudour 
    Abstract: This paper presents the design of a novel optimal fuzzy-PID controller based on the Multi-Verse Optimizer (MVO) for Load Frequency Control (LFC) of a two-area power system interconnected via a High Voltage Direct Current (HVDC) transmission link. The MVO algorithm was adopted to estimate the unknown parameters of the test system and model the HVDC link for the LFC analysis, and was then used to optimize the fuzzy-PID controller parameters, including the scaling factors of the fuzzy logic part and the PID controller gains. To demonstrate the effectiveness of the proposed control strategy, a two-area power system with an HVDC link connection was investigated in simulation. The estimated unknown parameters of the simulated model were compared with existing results obtained by simulating the same model with the Nonlinear Least-Squares Data-Fitting Algorithm (LSDFA) of the MATLAB Optimization Toolbox, available in the literature. The system dynamic responses were obtained considering single, multiple and dynamic load disturbances in both areas. A comparative study of the performance of the proposed controller, a fuzzy logic controller and a conventional PID controller was carried out. Furthermore, a robustness analysis of the proposed control strategy was performed by varying the system parameters over a wide range around the nominal values. The obtained results satisfy the LFC requirements and reveal that the MVO-optimized fuzzy-PID controller enhances power system frequency regulation in the presence of an HVDC link.
    Keywords: Multi-Verse Optimizer (MVO); PID Controller; Fuzzy Logic Controller (FLC); Load Frequency Control (LFC); HVDC Link.

  • EGSA: a New Enhanced Gravitational Search Algorithm to Resolve Multiple Sequence Alignment Problem   Order a copy of this article
    by Elamine ZEMALI, Abdelmadjid Boukra 
    Abstract: Multiple sequence alignment is a very important and useful tool for genomic analysis in many bioinformatics tasks. However, finding an accurate alignment of DNA or protein sequences is very difficult, since the computational effort required grows exponentially with the number of sequences. In this paper, we propose a new sequence alignment algorithm based on the gravitational search algorithm (GSA). The GSA is a recent metaheuristic inspired by Newton's laws of universal gravitation and motion. Moreover, to avoid convergence towards local optima, we enhance the behaviour of GSA by introducing a new mechanism based on the simulated annealing concept. This concept offers a good balance between exploration and exploitation in GSA and can lead to good alignment quality for the MSA problem. The accuracy and efficiency of the proposed algorithm are compared with recent and well-known alignment methods using the BAliBASE benchmark database. The analysis of the experimental results shows that the algorithm can achieve competitive solution quality.
    Keywords: Multiple sequence alignment; gravitational search algorithm; Metaheuristics; Bioinformatics; simulated annealing.
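
    For reference, the standard GSA update rules that EGSA builds on (following Rashedi et al.; the simulated-annealing acceptance mechanism added by the authors is not shown) can be written as:

      % Gravitational force on agent i from agent j and the resulting acceleration,
      % for dimension d at iteration t (G(t) is the decreasing gravitational constant)
      F_{ij}^{d}(t) = G(t)\,\frac{M_i(t)\,M_j(t)}{R_{ij}(t)+\varepsilon}\,\bigl(x_j^{d}(t)-x_i^{d}(t)\bigr),
      \qquad
      a_i^{d}(t) = \frac{1}{M_i(t)}\sum_{j \neq i} \mathrm{rand}_j \, F_{ij}^{d}(t),

      % Velocity and position updates
      v_i^{d}(t+1) = \mathrm{rand}_i \, v_i^{d}(t) + a_i^{d}(t),
      \qquad
      x_i^{d}(t+1) = x_i^{d}(t) + v_i^{d}(t+1).

    Here M_i is the mass derived from agent i's fitness and R_ij is the Euclidean distance between agents i and j.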

  • Ant colony optimization combined with variable neighborhood search for scheduling preventive railway maintenance activities   Order a copy of this article
    by SAFA KHALOULI, RACHID BENMANSOUR, SAID HANAFI 
    Abstract: Railway infrastructure maintenance is of fundamental importance in order to ensure a good service in terms of punctuality, safety and efficient operation of trains on the railway track, as well as passenger comfort. Track maintenance covers a large number of different activities such as inspections, repairs, replacement of failed components or modules, and renewals. In this paper, we address the problem of scheduling preventive railway maintenance activities. The goal is to reduce the track failure probability and prevent breakdowns so as to guarantee a stable and safe service under specified conditions. These activities increase the system's reliability and availability but require considerable resources and large costs, which can be minimized by scheduling the maintenance operations. This problem is proven to be NP-hard, and consequently the development of heuristic and meta-heuristic approaches to solve it is well justified. Thus, we propose two meta-heuristics, a variable neighborhood search (VNS) and an ant colony optimization (ACO), based on opportunities to deal with this problem. Then, we develop a hybrid approach combining ACO with VNS. The performance of our proposed algorithms is tested by numerical experiments on a large number of randomly generated instances. A comparison with optimal solutions is presented. The results show the effectiveness of our proposed methods.
    Keywords: Preventive maintenance; Scheduling; Variable neighborhood search; Ant colony optimization; Local search; Rail transportation.

  • A Two-Stage Hybrid Method for the Multi-Scenarios Max-Min Knapsack Problem   Order a copy of this article
    by Mhand Hifi, Thekra Al-Douri 
    Abstract: In this paper, we propose a two-stage hybrid method to approximately solve the multi-scenario max-min knapsack problem. The proposed method is based upon three complementary stages: (i) the building stage, (ii) the combination stage and (iii) the rebuild stage. First, the building stage provides a starting feasible solution using a greedy procedure; items are chosen randomly in order to reach a starting population of solutions. Second, the combination stage provides each new solution by combining subsets of (starting) solutions. Third, the rebuild stage performs an intensification in order to improve the solutions at hand. The proposed method is evaluated on a set of benchmark instances taken from the literature, and the obtained results are compared to those reached by the best algorithms available in the literature. The results show that the proposed method provides better solutions than those already published.
    Keywords: Heuristic; combinatorial; knapsack; optimization.
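
    Assuming the standard formulation of the problem (scenario profits p_i^s, weights w_i and capacity c; the abstract does not restate it), the multi-scenario max-min knapsack reads:

      % Choose items x_i in {0,1} maximising the worst-case profit over scenarios s = 1,...,S
      \max_{x \in \{0,1\}^{n}} \; \min_{s \in \{1,\dots,S\}} \; \sum_{i=1}^{n} p_i^{s}\, x_i
      \qquad \text{subject to} \qquad \sum_{i=1}^{n} w_i\, x_i \le c .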

  • Numerical Program Optimization by Automatic Improvement of the Accuracy of Computations   Order a copy of this article
    by Nasrine DAMOUCHE, Alexandre CHAPOUTOT, Matthieu Martel 
    Abstract: Over the last decade, guaranteeing the accuracy of computations relying on the IEEE 754 floating-point arithmetic has become increasingly complex. Failures, caused by small or large perturbations due to round-off errors, have been registered. To cope with this issue, we have developed a tool which corrects these errors by automatically transforming programs in a source-to-source manner. Our transformation, relying on static analysis by abstract interpretation, operates on pieces of code with assignments, conditionals and loops. By transforming programs, we can significantly optimize the numerical accuracy of computations by minimizing the error relative to the exact result. In this article, we present two important and desirable side effects of our transformation. Firstly, we show that our transformed programs, executed in single precision, may compete with untransformed codes executed in double precision. Secondly, we show that optimizing the numerical accuracy of programs accelerates the convergence of numerical iterative methods. Both of these properties of our transformation are of great interest for numerical software.
    Keywords: Program Transformation; Floating-Point Numbers; IEEE754 Standard; Data-Types Format Optimization; Convergence Acceleration.
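
    The kind of round-off sensitivity such transformations target can be seen with a simple summation; the sketch below is a generic illustration of evaluation-order effects in IEEE 754 doubles, not the authors' source-to-source transformation.

      # Adding many small terms to one large term: in the naive left-to-right order every
      # small contribution is absorbed by rounding, while the reassociated form keeps them.
      big, small, n = 1.0e16, 1.0, 10_000
      naive = big
      for _ in range(n):
          naive += small                    # 1e16 + 1.0 rounds back to 1e16 each time
      reassociated = big + small * n        # mathematically identical expression
      print(naive - big)                    # 0.0   (all small terms lost)
      print(reassociated - big)             # 10000.0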

  • Hybrid approach using multi-criteria methods and mathematical programming for outsourcing logistic problem   Order a copy of this article
    by Nesrine Bidani, Hela Moalla Frikha 
    Abstract: The decision maker can face difficult decision problems in the presence of multiple criteria. Indeed, the choice of a provider is a multi-criteria decision problem. This paper addresses a logistics outsourcing problem using multi-criteria decision aid. Several multi-criteria methods, such as PROMETHEE, require the decision maker to provide parameter values directly in order to obtain a ranking of alternatives. However, fixing precise parameter values directly is quite difficult, which raises the problem of the subjectivity of the provided parameters. To reduce and overcome this problem, we propose a new multi-criteria approach which hybridises objective methods, although this hybrid approach has some disadvantages. The results of the hybrid method are integrated into a mathematical program to choose a transport provider within the Tunisian Chemical Group (GCT) and to determine the number of providers and the optimal transported quantities.
    Keywords: PROMETHEE; AHP; revised AHP; Mathematical Programming; Hybrid approach; transport provider.

Special Issue on: Applications of Soft Computing and Intelligent Control

  • Analysis of Enhanced Complex SVR Interpolation and SCG-based Neural Networks for LTE Downlink System   Order a copy of this article
    by Anis CHARRADA 
    Abstract: In this article, we apply and evaluate the performance of Radial Basis Function (RBF)-based Support Vector Machine Regression (SVR) and Scaled Conjugate Gradient backpropagation (SCG)-based Artificial Neural Networks (ANN) for estimating channel variations in the frequency domain, using the standardized pilot symbol structure of the LTE downlink system. We apply complex SVR and ANN to estimate the realistic Vehicular A channel environment defined by the International Telecommunications Union (ITU). The suggested procedures use data obtained from the received pilot symbols to estimate the overall frequency response of the frequency-selective multipath fading channel in two stages. In the first stage, each technique learns to adjust to the channel fluctuations; then, in the second stage, it predicts all the channel frequency responses. Lastly, in order to assess the abilities of the considered channel estimators, the performance of complex SVR and ANN is compared with the traditional Least Squares (LS) and Decision Feedback (DF) methods. Computer simulation results demonstrate that the complex RBF-based SVR approach is more precise than the other estimation methods.
    Keywords: SVR; SCG; RBF; ANN; OFDM; LTE.

  • Software Fault Prediction using Firefly Algorithm   Order a copy of this article
    by Ishani Arora, Anju Saha 
    Abstract: Software fault prediction models provide quality analysts with prior information about the fault-prone modules detected early in the software development lifecycle. This enables software organisations to focus resources on the vulnerable modules and hence deliver a low-cost, maintainable and high-quality product to their customers. The software fault prediction literature has shown an immense growth of research studies involving artificial neural network based fault prediction models. However, the default gradient descent back propagation learning algorithm used in artificial neural networks carries a high risk of getting stuck in local minima of the search space. A class of nature-inspired computing methods, being stochastic and non-gradient based, overcomes this disadvantage of the back propagation optimisation method. This feature of nature-inspired techniques has helped artificial neural networks to evolve into a class of adaptive and optimised neural networks. In this work, we propose a hybrid software fault prediction model built using the firefly algorithm (FA) and an artificial neural network (ANN). We also perform an empirical comparison of the classification performance of the developed model with genetic algorithm (GA) and particle swarm optimisation (PSO) based evolutionary methods for optimising the connection weights of a neural network. Seven datasets from the PROMISE repository were used in the experiments, and the mean square error (MSE) and confusion matrix parameters were used for performance evaluation. The results show that the FA-ANN model performed better than the genetic and particle swarm optimised ANN fault prediction models.
    Keywords: artificial neural network; firefly algorithm; genetic algorithm; metaheuristic techniques; optimisation; particle swarm; software fault; software fault prediction; software quality; software testing.
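
    For reference, the standard firefly movement rule used to evolve candidate solutions is (following Yang's formulation; its coupling to the ANN weights is the authors' contribution and is only summarised in the abstract):

      % Firefly i moves towards a brighter (fitter) firefly j
      x_i \leftarrow x_i + \beta_0\, e^{-\gamma r_{ij}^{2}} \,(x_j - x_i) + \alpha\,\bigl(\mathrm{rand} - \tfrac{1}{2}\bigr),
      \qquad r_{ij} = \lVert x_i - x_j \rVert ,

    where beta_0 is the attractiveness at distance zero, gamma the light absorption coefficient and alpha the randomisation step; in the FA-ANN model each position vector x encodes a set of network connection weights, and brightness is presumably derived from the training error (e.g. the MSE).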

  • A very low order speech model based on a frequency selection-GA approach   Order a copy of this article
    by Lahcène MITICHE, Amel Baha Houda ADAMOU-MITICHE 
    Abstract: Using a new model order reduction based on frequency selection and an optimal genetic algorithm, a second-order speech model is calculated. In our algorithm, the modelling process starts with a full-order classical all-pole model obtained by the Burg method. The full-order AR model is then reduced using the proposed approach, based on genetic algorithms and the dominant frequencies of the original speech production system. The model reduction yields a second-order ARMA model which, interestingly, preserves the key properties of the original full-order model in the time and frequency domains. To illustrate the performance and effectiveness of the proposed approach, computer simulations are conducted on some practical speech segments. To show the novelty of our approach, a comparative study with an approximant given by the robust SVD-Schur technique is presented.
    Keywords: AR model; ARMA model; genetic algorithm; model order reduction; SVD−Schur technique; speech model; pole selection; ISE criterion.

  • An Improved Quantum Particle Swarm Optimization and Its Application on Hand Kinematics Tracking   Order a copy of this article
    by Zheng Zhao, Naigong Yu 
    Abstract: The evolution-inspired particle swarm optimization (PSO) has been widely employed in various scientific areas, and there have been plenty of contributions on the modification and improvement of PSO. Recently, a quantum-behaviour-inspired optimization algorithm (QPSO) was developed by modelling a delta potential well in quantum space, which shows better global search ability and convergence precision than the original PSO algorithm. In this paper, based on the principle of QPSO, we propose a dynamic search strategy fused with a chaotic map to strengthen the ability to escape from local optima, and replace the attractor with a beta distribution for faster convergence. We first compare this improved algorithm (DCQPSO) with PSO and QPSO on general optimization benchmark functions. Then, from the application point of view, we build a simplicity-oriented human hand kinematics tracking system using DCQPSO, which can further serve human-computer interaction (HCI). The experimental results indicate that DCQPSO outperforms both the traditional PSO and QPSO algorithms, and that it is well qualified for the optimization task in hand kinematics tracking.
    Keywords: PSO; QPSO; Chaos; Optimization; Hand Kinematics Tracking; HCI.

  • General study for energy recovery from used batteries using Fuzzy logic and PI controllers   Order a copy of this article
    by Jabrane Chakroun 
    Abstract: In this paper, we propose a special design for recovering energy from used batteries as a renewable power source. The proposed technique consists of designing and implementing Proportional-Integral (PI) and Fuzzy-Logic (FL) controllers to ensure a high conversion capability. The suggested controllers are specially designed to adapt to the dynamic behaviour of battery charge and discharge. To obtain the optimum residual energy, both methods are compared using MATLAB/Simulink. In our case, it turned out that the Fuzzy-Logic controller improves the performance and the rapidity of the system.
    Keywords: PI Controller; Battery; Fuzzy Logic Controller; Buck converter; Residual Power; Simulink.

  • Application of firefly algorithm for congestion management problem in the deregulated electricity market   Order a copy of this article
    by A. AHAMED JEELANI BASHA, M. ANITHA, E.B. ELANCHEZHIAN 
    Abstract: In the deregulated electricity market, transmission Congestion Management (CM) has become extremely important in order to ensure the security and reliability of the system. This paper proposes a method to manage congestion by optimal rescheduling of the active power of generators based on the Firefly Algorithm (FA). However, not all generators in the system need to take part in CM; thus, in this article, generators are selected based on the magnitude of their sensitivities to the congested line. The proposed FA is tested on the standard IEEE 30-bus and 118-bus systems and on a practical Indian utility 62-bus system for the solution of the CM problem. The results on these test systems provide minimum rescheduling cost and are compared with those of the CPSO, PSO-TVIW, PSO-TVAC, VEPSO and PSO-ITVAC methods. The results prove that FA is indeed capable of obtaining a high-quality solution for the CM problem.
    Keywords: Firefly algorithm; Generator sensitivity factor; Congestion management; Deregulated power market.

  • Multi Agent model based on combination of Chemical Reaction Optimization metaheuristic with Tabu Search for Flexible Job shop Scheduling Problem   Order a copy of this article
    by Bilel Marzouki, Olfa Belkahla Driss, Khaled Ghédira 
    Abstract: Scheduling in production systems consists of assigning operations to a set of available resources in order to achieve defined objectives. The Flexible Job shop Scheduling Problem (FJSP) is an extension of the classical job shop scheduling problem in which each operation can be processed on different machines and its processing time depends on the machine used. This paper proposes a multi-agent model based on a combination of the chemical reaction optimization metaheuristic with tabu search to solve the FJSP, with the objective of minimizing the maximum completion time (makespan). To evaluate the performance of our model, experiments are performed on well-known benchmark instances proposed in the literature, and comparisons are made with other approaches from the literature.
    Keywords: Manufacturing; Production System; Industrial Engineering; Scheduling; Optimization; Flexible job Shop Problem; Artificial Intelligence; Multi Agent System; Chemical Reaction Optimization metaheuristic; Tabu search; Decision Making; Metaheuristic; Hybridization.

  • Performance Improvement of the Particle Swarm Optimization algorithm for the Flexible Job Shop Problem under Machines Breakdown   Order a copy of this article
    by Rim Zarrouk, Imed E. Bennour, Abderrazek Jemai 
    Abstract: One of the most challenging problems in the manufacturing field is solving the flexible job shop problem (FJSP) subject to machine breakdowns (which cause a loss of time). The particle swarm optimization (PSO) meta-heuristic is well suited to solving the FJSP, but it can be time-consuming, especially on single-core platforms. In this paper, we propose a set of PSO-FJSP variants that aim to improve the run time of the pre-scheduling step. We then propose three rescheduling variants to handle machine breakdowns: two variants aim to improve the robustness of the schedule, while the third aims to improve its stability. Standard benchmarks are used to evaluate and compare the proposed variants.
    Keywords: Flexible job shop problem; swarm optimization; scheduling; performance; Machine breakdowns.

  • FFA Based Speed Control of BLDC Motor Drive   Order a copy of this article
    by MANOJ KUMAR MERUGUMALLA, PREMA KUMAR NAVURI 
    Abstract: This paper presents the speed control of a brushless direct current (BLDC) motor drive using a nature-inspired algorithm. Such algorithms can be applied to virtually any problem that can be treated as an optimization task. The speed controller design problem is formulated as an optimization problem, and the firefly algorithm (FFA) is employed to search for the optimal Proportional-Integral-Derivative (PID) parameters of the speed controller by minimizing a time-domain objective function. The performance of the proposed FFA-PID controller for the speed control of the brushless direct current motor has been tested under sudden changes of set-point and load torque, for below rated speed and at rated speed, and is compared with the bat algorithm based controller (BA-PID). The brushless direct current motor drive has been simulated using MATLAB/SIMULINK, and the simulation results demonstrate the effectiveness of the proposed algorithm in controlling the speed of the motor drive compared with the bat algorithm in terms of several time-domain parameters.
    Keywords: Brushless direct current motor; Sensorless control; Firefly Algorithm; Bat Algorithm; PID Controller; Optimization.

  • Fault detection and isolation of asynchronous machine based on the probabilistic neural network (PNN)   Order a copy of this article
    by Rahma Ouhibi 
    Abstract: In this paper, we propose three neural network based methods for fault detection and isolation of an asynchronous machine: a probabilistic neural network (PNN), a multi-layer perceptron (MLP), and a generalized regression neural network (GRNN). To obtain reliable diagnostic results, a cross-validation procedure is used in which the input data are partitioned into three sets: a training set, a validation set and a test set. The RMS values of the three-phase stator voltages and currents are used as model inputs to identify the different types of faults and the normal operating mode. The efficiency of the three neural network based methods is compared using a test set of 100 samples.
    Keywords: asynchronous machine; fault detection and isolation (FDI); artificial intelligence; probabilistic neural network (PNN); multi-layer perceptron (MLP); Generalized Regression Neural Network (GRNN).
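
    A PNN is essentially a Parzen-window (kernel density) classifier; the sketch below shows that core idea on generic feature vectors, with sigma as a hypothetical smoothing parameter, and is not the authors' trained diagnostic model.

      import numpy as np

      def pnn_predict(X_train, y_train, X_test, sigma=0.1):
          """Probabilistic neural network: score each class by a Gaussian Parzen-window
          density estimate built from its training samples, then pick the best class."""
          classes = np.unique(y_train)
          scores = []
          for c in classes:
              Xc = X_train[y_train == c]
              d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=-1)   # squared distances
              scores.append(np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1))     # class-conditional score
          return classes[np.argmax(np.stack(scores, axis=1), axis=1)]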