International Journal of Computational Systems Engineering (12 papers in press)
An Investigation into the Provision of a Decision Support System to Evaluate Software Performance under Uncertainty.
by Md. Mahashin Mia, Mohammad Shahadat Hossain, Rashed Mustafa, Atiqur Rahman
Abstract: Poor software performance can be disruptive, with the potential to halt everyday activities across an affected area. An early prediction of software performance could therefore play an important role in saving human time as well as daily-life activities. Indicators such as efficiency, coverage, and reliability of a system can be used to predict software performance. However, these factors cannot be determined accurately because of the presence of different categories of uncertainty. Therefore, this article presents a belief rule-based expert system (BRBES) that is capable of predicting software performance under uncertainty. Historical data on the performance of various software systems, with specific reference to efficiency, coverage, and reliability, have been used in validating the BRBES. The dependability of the proposed BRBES's output is measured in comparison with a fuzzy logic-based expert system (FLBES) and an artificial neural network (ANN) based system, and the BRBES's results are found to be more reliable than those of the FLBES and ANN. This BRBES can therefore be used to predict software performance in an area by taking into account data related to efficiency, coverage, and reliability.
Keywords: Software; Uncertainty; Prediction; Expert system; Belief rule base.
Validation of non-functional scalability requirement in the development of Versat Sarasola software
by Yuliet Fernández Lavalle, Zoila Esther Morales Tabares
Abstract: Software that works with databases achieves one of its major objectives if it satisfies the non-functional requirement of scalability. Financial accounting technology gains a great deal from scalable software, but scalability alone does not guarantee quality. The Cuban accounting software system Versat Sarasola, despite being certified, is scalable software of low quality for clients and specialists. This article describes a testing strategy for the non-functional scalability requirement in the development of the Versat Sarasola software, based on a set of international standards and models, with the aim of achieving correct and adequate quality in Versat Sarasola by managing scalability throughout its development.
Keywords: software; technology; non-functional requirement; scalable; quality.
Analysing the K-Means Algorithm and Implementing a New Clustering Algorithm
by H. Parthasarathi Patra, K. Nikitha
Abstract: Clustering is a machine learning technique that involves grouping data points: a clustering algorithm assigns each data point to a specific group. K-means is the most widely used and well-known algorithm for cluster analysis. The K-means clustering algorithm is a partitioning method that divides data objects into k different clusters. However, the classical K-means algorithm has two main drawbacks: it is sensitive to the choice of initial centroids, and the total number of clusters is difficult to determine. To improve the performance of K-means, this paper presents a modified K-means method that uses intra-class distance to cluster the data.
Keywords: Machine learning; Clustering analysis; K-means.
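For context, a minimal pure-Python sketch of the classical K-means baseline the abstract discusses, with a helper that computes the total intra-class (within-cluster) distance the proposed modification builds on. The random initialisation in `kmeans` is exactly the centroid sensitivity the paper targets; this is the standard algorithm, not the authors' modified method, and the sample points are hypothetical.

```python
import math
import random

def kmeans(points, k, iters=100, seed=0):
    """Classical Lloyd's K-means on numeric tuples; returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # random init -- the sensitivity the paper targets
    labels = []
    for _ in range(iters):
        # assign each point to its nearest centroid
        labels = [min(range(k), key=lambda j: math.dist(p, centroids[j])) for p in points]
        # recompute each centroid as the mean of its members
        new = []
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            new.append(tuple(sum(c) / len(members) for c in zip(*members)) if members else centroids[j])
        if new == centroids:  # converged
            break
        centroids = new
    return centroids, labels

def intra_class_distance(points, centroids, labels):
    """Total within-cluster distance -- lower means tighter clusters."""
    return sum(math.dist(p, centroids[l]) for p, l in zip(points, labels))

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
cents, labs = kmeans(pts, k=2)
print(intra_class_distance(pts, cents, labs))
```

Rerunning with different seeds shows how the final intra-class distance depends on the initial centroids, which is the drawback the proposed method addresses.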
Stock Price Prediction using Historical Data and News Articles: A Survey
by Vijay Dwivedi, Manoj Madhava Gore
Abstract: Stock traders predict the price of a stock to maximise their trading profit in the stock market. Predicting a stock price is a complicated task, as the price changes frequently and abruptly. The volatility of stock prices is affected by various social, economic, and political factors. A study of the literature on stock price prediction reveals that existing models perform prediction using datasets of historical data, news articles, or both. The major stock price prediction techniques are fundamental analysis, technical analysis, machine learning, and ensemble approaches. Existing techniques predict the closing price of a stock after the Sensex closes on each trading day; the predicted price is not always useful for stock traders in accomplishing their objectives. Numerous studies have attempted to predict the price of a stock accurately and consistently, yet there is still scope for improvement in this area. Traders require efficient stock price prediction that can deliver increased accuracy and consistency within a stipulated amount of time. This article reviews several approaches employed for stock price prediction using historical data, news articles, or both. It also highlights various open research issues and challenges that may be helpful to interested researchers.
Keywords: Historical Data of Stock Price; Ensemble; Machine Learning; News Article; Stock Price Prediction.
An API-intermediation system to facilitate data circulation for public services: the French case study
by Christophe Gaie
Abstract: Nowadays, the circulation of data between government administrations remains a key concern for improving the efficiency of public services. The Once Only Principle was introduced to reduce the complexity of administrative procedures by reducing the number of documents to supply. However, its implementation remains difficult, as it requires connecting a large number of entities with varying levels of IT expertise. This article therefore describes the value of introducing a simple intermediation system (based on a REST API) to simplify data circulation and facilitate e-Government.
Keywords: e-Government; Public services; Once Only Principle; Intermediation; API; REST; Simplification.
Addressing Long tail problem in Music Recommendation Systems
by Sunitha, Adilakshmi Thondepu
Abstract: Music Recommendation Systems (MRS) are information filtering tools used to handle the information overload problem in the music domain. Collaborative Filtering (CF) is the most frequently used approach for providing recommendations. Although CF is simple and popular, it suffers from popularity bias. Discovering songs that are not popular but might interest a user is a promising direction in music recommendation. This paper proposes a multi-stage graph-based method and a KNN-based method to identify and recommend less popular songs, also known as long-tail songs. MSG_WEIGHTS computes the recommendation vector based on the graph weights, and two variants, MSG_KNN and MSG_K-Means, are proposed to identify tail songs. The second method applies KNN to identify relatively less frequent songs for recommendation. The results show that the proposed methods are able to identify novel songs from the tail for recommendation.
Keywords: Music Recommendation Systems; Information overloading; Long Tail; Multi-stage graph; Head; Mid; Tail.
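For context, the head/tail split the keywords refer to is commonly made on the cumulative popularity curve of the catalogue. A minimal sketch under that assumption, with hypothetical play counts (this illustrates the long-tail notion, not the paper's MSG or KNN methods):

```python
def split_long_tail(play_counts, head_share=0.8):
    """Partition items into head and tail by cumulative share of total plays.

    play_counts: dict mapping song -> play count (hypothetical data).
    Items covering the first `head_share` of all plays form the head;
    the remainder is the long tail targeted for novel recommendations.
    """
    total = sum(play_counts.values())
    head, tail, running = [], [], 0
    for song, count in sorted(play_counts.items(), key=lambda kv: -kv[1]):
        if running < head_share * total:
            head.append(song)
        else:
            tail.append(song)
        running += count
    return head, tail

counts = {"hit1": 500, "hit2": 300, "album_cut": 60,
          "rare1": 25, "rare2": 10, "rare3": 5}
head, tail = split_long_tail(counts)
print(head, tail)  # a few hits cover 80% of plays; everything else is tail
```

With these counts, two hits alone cover more than 80% of all plays, which is the popularity bias that makes CF keep recommending them.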
Machine Learning-Based Software Requirements Identification for a Large Number of Features
by Pratvina Talele, Rashmi Phalnikar
Abstract: Software is extremely important in today's market, and the complexity of requirements identification is a serious requirement engineering problem. As the number of software requirements (SR) for a system increases, conflicts arise in categorising the SR, necessitating intelligent techniques to discover and fix inconsistencies. The aim of this study is to compare existing Machine Learning (ML) algorithms to determine which of them is likely to identify SR most efficiently. Different natural language processing methods are used in the text preprocessing phase, and Term Frequency-Inverse Document Frequency is used in the feature extraction phase. We apply ML algorithms to a dataset extracted from publicly available SRS documents and show empirically that they are successful in identifying SR. Inconsistencies are found and rectified using the different ML methods. Furthermore, our study aids in identifying discrepancies during the classification of software requirements.
Keywords: Software Requirements; Machine Learning.
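For context, a minimal stdlib sketch of the TF-IDF weighting the abstract names for its feature extraction phase, applied to hypothetical requirement statements (this is the standard formula with smoothed IDF, not the authors' full pipeline or preprocessing):

```python
import math
from collections import Counter

docs = [  # hypothetical requirement statements
    "the system shall encrypt stored user data",
    "the system shall respond within two seconds",
    "users shall log in with a password",
]

def tf_idf(docs):
    """Return one {term: weight} vector per document (raw tf x smoothed idf)."""
    tokenized = [d.split() for d in docs]
    # document frequency: in how many documents each term appears
    df = Counter(t for doc in tokenized for t in set(doc))
    n = len(docs)
    vectors = []
    for doc in tokenized:
        tf = Counter(doc)
        vectors.append({t: tf[t] * math.log((1 + n) / (1 + df[t])) for t in tf})
    return vectors

vecs = tf_idf(docs)
# terms shared by every requirement (e.g. "shall") get zero weight,
# while discriminative terms (e.g. "encrypt") score highest
print(sorted(vecs[0], key=vecs[0].get, reverse=True)[:3])
```

These vectors are what a downstream ML classifier would consume when labelling each statement as a functional or non-functional requirement.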
Cognitive Radio Enabled Smart Meter Based on Real Time Pricing System using Hybrid FPA-GSA Algorithm
by Deepa Das
Abstract: The spectrum scarcity issue in a cognitive radio (CR) enabled smart grid (SG) heterogeneous network is significantly alleviated by equipping all the gateways and the smart meters on the consumers' side with CR technology. However, such an architecture requires comparatively more power for its operation, which leads to an excessive power tariff. To resolve this problem, an adaptive resource allocation scheme based on a metaheuristic algorithm is proposed for maximising the aggregate profit of the network while considering the association of all users. To achieve this, a hybrid Flower Pollination Algorithm-Gravitational Search Algorithm (Hybrid FPA-GSA) is proposed to simultaneously optimise the electricity price and the transmission power allocated to the consumers, which benefits both consumers and the supplier. The efficacy of the proposed Hybrid FPA-GSA is verified on several benchmark functions and then applied to the system model. Further, the effect of different parameters on system performance is also studied.
Keywords: Smart grid; home area network; cognitive radio; aggregate profit; Hybrid Flower Pollination Algorithm-Gravitational Search Algorithm.
Special Issue on: ISPR 2020 Recent Advances in Intelligent Systems and Pattern Recognition
Structural Refinement of a Manually Created Bayesian Network for Prostate Cancer Diagnosis
by Naveen Kumar Bhimagavni, Adilakshmi Thondepu
Abstract: In general, the structure of a Bayesian network can be learnt from the available data. In domains such as medicine, a Bayesian network can be manually created by domain experts, and statistical methods can be applied to refine the structure based on the data. As data continuously evolves in many real-world applications, refinement of the expert network structure is unavoidable. Existing techniques refine a manually constructed Bayesian network either by verifying the relation of a node with all the remaining nodes in the network (Expert Bayes) or by examining a node only with its parents (MDL principle). In this work, we propose an algorithm that verifies the relation of a node only with its non-descendant nodes, which are identified using the Markov assumption. The proposed algorithm makes small changes to the original network and shows that fewer operations are required to find the best network structure. Maximum Likelihood Estimation (MLE) is used as a scoring function to score each candidate structure, and the network with the highest score is selected. A manually created Bayesian network for the widespread disease prostate cancer has been collected, and the proposed algorithm refines its structure.
Keywords: Bayesian network; Prostate cancer; Markov Assumption; Maximum Likelihood Estimation (MLE); Probabilistic Graphical Model (PGM); Refinement algorithm.
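For context, a minimal sketch of the MLE scoring step the abstract mentions: each candidate structure is scored by the likelihood the data assigns to each node's conditional probability table (CPT) given its parents, and the structure with the higher score wins. The records below are hypothetical illustrative data, not from the paper.

```python
import math
from collections import Counter

def mle_cpt(data, child, parents):
    """Maximum-likelihood CPT: P(child | parents) estimated from counts.

    data: list of dicts mapping variable name -> value (hypothetical records).
    """
    joint, marginal = Counter(), Counter()
    for record in data:
        pa = tuple(record[p] for p in parents)
        joint[(pa, record[child])] += 1
        marginal[pa] += 1
    return {key: n / marginal[key[0]] for key, n in joint.items()}

def log_likelihood(data, child, parents):
    """MLE log-likelihood score of this local structure; higher is better."""
    cpt = mle_cpt(data, child, parents)
    return sum(math.log(cpt[(tuple(r[p] for p in parents), r[child])]) for r in data)

records = [  # hypothetical screening records: PSA level -> cancer indicator
    {"psa_high": 1, "cancer": 1}, {"psa_high": 1, "cancer": 1},
    {"psa_high": 1, "cancer": 0}, {"psa_high": 0, "cancer": 0},
    {"psa_high": 0, "cancer": 0}, {"psa_high": 0, "cancer": 1},
]
cpt = mle_cpt(records, "cancer", ["psa_high"])
print(cpt[((1,), 1)])  # P(cancer=1 | psa_high=1)
# adding the edge psa_high -> cancer raises the score over no parents at all
print(log_likelihood(records, "cancer", ["psa_high"]) >
      log_likelihood(records, "cancer", []))
```

Comparing the two scores is exactly how a refinement step decides whether adding or removing an edge improves the candidate structure.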
Selection of statistical Wavelet features using wrapper approach for Electrical Appliances identification based on KNN classifier combined with voting rules method
by Ghazali Fateh, Abdenour Hacine Gharbi, Philippe Ravier
Abstract: This work is an extended version of a paper presented at the International Conference on Intelligent Systems and Pattern Recognition, in which the authors proposed a compact feature representation based on the estimation of statistical features using the discrete wavelet transform, for electrical appliance identification with a K-Nearest Neighbour classifier combined with a voting rule strategy. The results showed that the Wavelet Cepstral Coefficients (WCC) descriptor achieves the highest performance, with a 98.13% classification rate (CR). In this work, we propose several extensions: (i) the logarithmic energy (LOG_E) is used as an additional descriptor; (ii) the relevance of the wavelet-based features combined with the LOG_E descriptor is investigated using feature selection based on a wrapper approach; (iii) a deeper performance evaluation is carried out using five additional metrics. The results show that selecting four WCC features combined with LOG_E improves the CR to 98.51%.
Keywords: Electrical Appliances Identification; Statistical Feature Extraction; Discrete Wavelet Analysis; K Nearest Neighbor classifier; voting rule method; wrapper feature selection approach.
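For context, a minimal sketch of the KNN-plus-voting decision scheme the abstract describes: each frame of a recording is classified by majority vote over its k nearest training vectors, and a second vote across frames gives the final appliance label. The 2-D feature vectors stand in for the paper's statistical wavelet features and are hypothetical.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Plain KNN: majority vote over the k nearest training feature vectors.

    train: list of (feature_tuple, label) pairs.
    """
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

def vote_over_frames(train, frames, k=3):
    """Voting-rule step: classify each frame, then take the majority label --
    the appliance whose label wins the most frames is the final decision."""
    return Counter(knn_predict(train, f, k) for f in frames).most_common(1)[0][0]

# hypothetical 2-D statistical wavelet features per appliance
train = [((0.10, 0.20), "kettle"), ((0.12, 0.22), "kettle"),
         ((0.11, 0.21), "kettle"), ((0.90, 0.80), "fridge"),
         ((0.88, 0.83), "fridge"), ((0.92, 0.79), "fridge")]
frames = [(0.10, 0.21), (0.13, 0.20), (0.90, 0.81)]  # frames from one recording
print(vote_over_frames(train, frames))
```

Voting across frames makes the final decision robust to a few misclassified frames, which is why it lifts the per-frame classification rate.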
Finger Vein Biometric Scanner Design Using Raspberry Pi
by Sara Daas, Amira Yahi, Mohamed Boughazi, El-Bay Bourennane
Abstract: Finger vein biometric systems have gained a lot of attention in recent years due to the increasing demand for high-security systems. A biometric device captures an image of the human finger veins and uses it for security tasks such as authentication, verification, and identification. Most existing finger vein capture devices are not suitable for research and development because of their proprietary verification software. For that reason, this paper focuses on designing and developing a finger vein biometric system based on an Arduino and a Raspberry Pi board. The proposed finger vein device relies on near-infrared (NIR) light. The Arduino microcontroller is used to automatically control the brightness and determine the impact of NIR lighting on the captured images, while the Raspberry Pi board controls all external peripherals of the system. The effectiveness of the proposed design has been evaluated using objective Image Quality Assessment (IQA) metrics, i.e. MSE, PSNR, IQI, AD, NK, SC, MD, LMSE, and NAE. Experimental results show high performance, with an MSE improvement of 61.39% and a notable PSNR reaching 33.73%, compared with existing state-of-the-art designs.
Keywords: Finger vein; Arduino; Raspberry Pi; near-infrared light; two-dimensional entropy; PWM; Image Quality Assessment (IQA).
Combination of a DAE-CNN and OC-SVDD for Intrusion Detection
by Hamza Frihia, Halima Bahi, Djamel Eddine Mahrougui
Abstract: The extensive use of the Internet has favoured the emergence of intrusion detection systems (IDSs) that scan network traffic to detect potential attacks. Detecting malicious events requires learning the patterns that represent attacks; meanwhile, new threats appear regularly. Thus, it is of paramount importance to develop intrusion detection systems that do not depend on known malicious patterns. In this paper, we leverage advances in deep learning and in the one-class classification approach to build an IDS. The proposed IDS uses a deep auto-encoder (DAE), whose layers are convolutional, to extract robust features from an event description, and the One-Class Support Vector Data Description (OC-SVDD) method, a modified version of the well-known OC-SVM (One-Class Support Vector Machine), to detect intrusions. The DAE is trained exclusively on normal patterns and is expected to extract robust features representing normal traffic. The OC-SVDD is trained on these features so that, during the test stage, malicious events are classified as outliers. We report experiments on the well-known NSL-KDD dataset. The experimental results show an accuracy of about 97.73% and demonstrate the potential of the proposed approach to distinguish between normal and malicious traffic.
Keywords: Computer security; Intrusion Detection System; One-Class Support Vector Data Description (OC-SVDD); Deep Auto-Encoder; Convolutional Neural Network; NSL-KDD dataset.
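For context, the core idea of SVDD is to enclose the normal training data in a minimal hypersphere and flag anything falling outside it. A deliberately simplified sketch of that decision rule (mean centre, distance-quantile radius; no kernel, no slack variables, and not the paper's DAE features or trained OC-SVDD model), on hypothetical 2-D feature vectors:

```python
import math

def fit_hypersphere(normal_vectors, quantile=0.95):
    """Simplified SVDD-style detector: centre at the mean of the normal data,
    radius at the given quantile of the training distances."""
    dims = len(normal_vectors[0])
    center = tuple(sum(v[i] for v in normal_vectors) / len(normal_vectors)
                   for i in range(dims))
    dists = sorted(math.dist(v, center) for v in normal_vectors)
    radius = dists[min(int(quantile * len(dists)), len(dists) - 1)]
    return center, radius

def is_intrusion(vector, center, radius):
    """Events falling outside the learned sphere are flagged as outliers."""
    return math.dist(vector, center) > radius

# hypothetical 2-D features for normal traffic (the paper extracts these with a DAE)
normal = [(1.00, 1.10), (0.90, 1.00), (1.10, 0.90), (1.00, 0.95), (0.95, 1.05)]
center, radius = fit_hypersphere(normal)
print(is_intrusion((5.0, 7.0), center, radius))  # far from normal traffic -> True
```

Because the sphere is fitted only on normal traffic, previously unseen attack patterns can still be caught as outliers, which is the motivation the abstract gives for the one-class approach.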