Forthcoming articles

International Journal of Computational Systems Engineering (IJCSysE)

These articles have been peer-reviewed and accepted for publication but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

Register for our alerting service, which notifies you by email when new issues are published online.

Open Access: Articles marked with this Open Access icon are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licenses.
We also offer RSS feeds, which provide timely updates of tables of contents, newly published articles and calls for papers.

International Journal of Computational Systems Engineering (8 papers in press)

Regular Issues

  • Towards Recent Developments in the Methods, Metrics and Datasets of Software Fault Prediction   Order a copy of this article
    by Deepak Sharma, Pravin Chandra 
    Abstract: The changing software environment magnifies the demand for quality software. Software fault prediction is a requisite activity for the development of economical, efficient and high-quality software: it is the procedure of building models that help to identify faulty modules during the early phases of the software development lifecycle, and it is one of the most prevalent research disciplines. Existing studies in this domain include numerous modeling techniques and software metrics for the early prediction of software faults. This paper explores some of the prominent studies on software fault prediction in the existing literature. Software fault prediction papers from 1990 to 2017 are investigated; the analysis covers studies with empirical validation published in reputable venues. The paper reflects the methods, metrics and datasets available in the literature for software fault prediction. In addition, modeling techniques based on both traditional and computational-intelligence-based methods are reviewed. This paper is an endeavor to assemble the existing techniques and metrics of software fault prediction, with the aim of helping researchers easily evaluate suitable metrics for their own research scenarios.
    Keywords: Software Fault Prediction; Fault Tolerance; Computational Intelligence; Software Metrics; Evaluation Metrics.
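
    The prediction procedure the abstract describes, fitting a model on module-level software metrics to flag fault-prone modules, can be sketched with a toy hand-rolled logistic regression on invented metric data (all distributions, metrics and values below are illustrative assumptions, not drawn from the paper):

```python
import math, random

# Toy sketch of metric-based fault prediction: synthetic module metrics
# (lines of code, cyclomatic complexity) and a hand-rolled logistic
# regression. All distributions and values are illustrative assumptions.
random.seed(0)

def make_module(faulty):
    loc = random.gauss(400 if faulty else 150, 80)   # lines of code
    cc = random.gauss(25 if faulty else 8, 5)        # cyclomatic complexity
    return loc, cc, 1 if faulty else 0

data = [make_module(i % 2 == 0) for i in range(400)]

# Scale features so plain gradient descent behaves.
max_loc = max(abs(d[0]) for d in data)
max_cc = max(abs(d[1]) for d in data)
data = [(loc / max_loc, cc / max_cc, y) for loc, cc, y in data]

w1 = w2 = b = 0.0
for _ in range(2000):                                # batch gradient descent
    g1 = g2 = gb = 0.0
    for x1, x2, y in data:
        p = 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + b)))
        g1, g2, gb = g1 + (p - y) * x1, g2 + (p - y) * x2, gb + (p - y)
    n = len(data)
    w1, w2, b = w1 - 0.5 * g1 / n, w2 - 0.5 * g2 / n, b - 0.5 * gb / n

predict = lambda x1, x2: 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + b))) > 0.5
accuracy = sum(predict(x1, x2) == bool(y) for x1, x2, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

    Real studies in this survey train on historical project data and evaluate on held-out modules; this sketch only shows the shape of the modeling step.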

  • Comparing Robustness of Realised Measures under Round-off Errors, Price Adjustments and Serial Correlations: A Simulation Study   Order a copy of this article
    by Hiroumi Misaki 
    Abstract: We compare the accuracy of realised measures using a number of computer simulations. Realised measures are methods used to estimate integrated volatility from high-frequency data. We consider a simple realised volatility (RV), a 5-minute RV, a subsampled 5-minute RV, a two-scale estimator (TS), a realised kernel (RK), a pre-averaging estimator (PA) and a separating information maximum likelihood estimator (SIML). We use seven market microstructure models, which include round-off errors, price adjustments and serial correlations. The SIML is not unreasonably biased in any case, implying that it is sufficiently robust to market microstructure noise in any form. We also find that the SIML is the only realised measure that maintains consistency in all our simulations. We conclude that the SIML is suitable for practical applications.
    Keywords: finance; high frequency data; decision making; realised measures; volatility estimation; robustness; market microstructure noise; round-off; price adjustments; serial correlations; simulation study; high performance computing; separating information maximum likelihood; SIML.
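
    The microstructure-noise problem motivating these estimators can be reproduced in a few lines: summing squared tick-by-tick returns (a simple RV) is dominated by noise, while sparse 5-minute sampling reduces the bias. The simulation below is a minimal sketch with assumed volatility and noise levels, unrelated to the paper's seven microstructure models:

```python
import math, random

random.seed(42)

# One trading day of a log-price random walk with constant volatility,
# observed every second with i.i.d. microstructure noise (assumed values).
n = 23400                      # one observation per second, 6.5 hours
true_iv = 0.0004               # integrated variance over the day (assumed)
sigma_step = math.sqrt(true_iv / n)
noise_sd = 0.0005              # microstructure noise std dev (assumed)

efficient = [0.0]
for _ in range(n):
    efficient.append(efficient[-1] + random.gauss(0.0, sigma_step))
observed = [p + random.gauss(0.0, noise_sd) for p in efficient]

def realised_variance(prices, step=1):
    """Sum of squared returns sampled every `step` observations."""
    sampled = prices[::step]
    return sum((b - a) ** 2 for a, b in zip(sampled, sampled[1:]))

rv_dense = realised_variance(observed)        # every tick: noise-dominated,
                                              # biased by roughly 2*n*noise_sd^2
rv_sparse = realised_variance(observed, 300)  # every 5 minutes: much less bias
print(f"true IV  : {true_iv:.6f}")
print(f"tick RV  : {rv_dense:.6f}")
print(f"5-min RV : {rv_sparse:.6f}")
```

    Estimators such as TS, RK, PA and SIML aim to use all the data while remaining robust to this noise, instead of discarding observations as sparse sampling does.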

    by Shikha Singla, Gaurav Gupta 
    Abstract: Mutual funds are an excellent medium for investors who do not have much know-how about financial markets. Investors generally take a fund's historical performance and its rating as harbingers of its future performance. This myopic selection and prediction criterion sometimes leads to wrong fund allocation and hence poor portfolio performance. Performance prediction depends upon a number of controlled and uncontrolled factors and is very difficult to carry out precisely. In the past, however, many studies have tried to predict performance using various statistical techniques. This review paper covers the different techniques used to assess fund performance. Through this survey it is found that much work remains to be done in the field of mutual fund performance analysis by taking different factors into consideration. Finally, the paper gives a brief literature review of mutual fund performance prediction models.
    Keywords: Mutual fund; performance evaluation; Net Asset Value (NAV); Data Envelopment Analysis (DEA); Back Propagation Network (BPN).

  • Innovative Study of EOQ (Economic Order Quantity) Model for Quadratic Time-Linked Demand under Tolerable Delay in Payments with Inconsistent Holding Cost and Associated Salvage Value   Order a copy of this article
    by R.P. Tripathi, Shweta Singh Tomar 
    Abstract: In this study, an effort is made to develop an inventory model for deteriorating items with variable holding cost, in which the seller offers a permissible delay in payments to settle the account against purchases, under quadratic time-dependent demand. In the majority of inventory models, authors have assumed that holding cost is constant, but in actual practice this is not always true. In this manuscript, holding cost is taken to be linearly time-dependent. An algorithm is presented for the seller to determine the optimal cycle time that minimizes the total inventory cost. Numerical examples and a sensitivity assessment are discussed to demonstrate the theoretical results. A first-order approximation is used for the exponential terms.
    Keywords: Deterioration; time-dependent holding cost; permissible delay; quadratic demand rate; salvage value.
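
    As background, the classic constant-parameter EOQ baseline that this model generalizes (to quadratic demand and time-varying holding cost) can be verified numerically; all parameter values here are illustrative assumptions:

```python
import math

# Classic EOQ baseline: constant demand and holding cost, no deterioration,
# no payment delay. Parameter values below are illustrative assumptions.
D = 1200.0   # annual demand (units/year)
K = 50.0     # ordering cost per order
h = 4.0      # holding cost per unit per year

def total_cost(q):
    """Annual ordering cost plus annual holding cost for order quantity q."""
    return K * D / q + h * q / 2.0

# Closed-form optimum Q* = sqrt(2DK/h) and a coarse grid-search check.
q_star = math.sqrt(2.0 * D * K / h)
q_grid = min((total_cost(q), q) for q in [x / 10.0 for x in range(10, 20000)])[1]
print(f"closed-form EOQ : {q_star:.1f}")
print(f"grid-search EOQ : {q_grid:.1f}")
```

    The paper's model replaces the constant demand and holding-cost terms with time-dependent ones, so its optimal cycle time comes from an algorithm rather than a closed form.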

  • A Survey on Effects of Class Imbalance in Data Pre-processing Stage of Classification Problem   Order a copy of this article
    by Nitin Malave, Anant Nimkar 
    Abstract: Classifier learning with datasets suffering from an imbalanced class distribution is a challenging task, and the imbalance hinders the performance of machine learning algorithms. It occurs when one class is heavily outnumbered by another. This kind of data distribution in real-world applications has caught the attention of many researchers. This paper reviews various state-of-the-art sampling and ensemble techniques for resolving class imbalance. Classification in the imbalanced domain can be a binary or a multi-class problem; this paper discusses techniques applied to binary classification. Various other factors, such as the threshold of distribution and inter- or within-class imbalance, make class imbalance a more complex issue. The threshold indicates the severity of imbalance among the classes; a typical minority-to-majority ratio is 1:9, although different datasets can have different ratios depending on their distributions. Various techniques have been used throughout the literature to alleviate the class imbalance problem, including data sampling, cost-sensitive methods, bagging and boosting. Comparisons of these approaches, with their advantages and disadvantages, are shown in the literature. The parameters used to evaluate model performance are also reviewed: accuracy is the evaluation parameter most commonly used in machine learning, but the review finds that parameters such as precision, recall and AUC-ROC provide more informative statistical measures for evaluating a model. The paper also gives research directions in the domain of class imbalance problems.
    Keywords: Machine Learning; Class Imbalance; Rare Event Detection; Classification; Resampling Techniques.
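
    The point about accuracy versus precision and recall can be illustrated with a toy 1:9 imbalanced label set (all counts invented for illustration):

```python
# Toy 1:9 imbalanced label set: 90 majority (y=0) and 10 minority (y=1).
y_true = [0] * 90 + [1] * 10

# A degenerate classifier that always predicts the majority class.
y_majority = [0] * 100

# A model that finds 7 of the 10 minority cases at the cost of 5 false alarms.
y_model = [0] * 85 + [1] * 5 + [1] * 7 + [0] * 3

def accuracy(t, p):
    return sum(a == b for a, b in zip(t, p)) / len(t)

def precision_recall(t, p):
    tp = sum(a == 1 and b == 1 for a, b in zip(t, p))
    fp = sum(a == 0 and b == 1 for a, b in zip(t, p))
    fn = sum(a == 1 and b == 0 for a, b in zip(t, p))
    return tp / (tp + fp), tp / (tp + fn)

# Accuracy barely distinguishes the two (0.90 vs 0.92), while precision
# and recall expose what each classifier does on the rare class: the
# always-majority classifier has recall 0 on the minority class.
print("always-majority accuracy:", accuracy(y_true, y_majority))
print("model accuracy          :", accuracy(y_true, y_model))
print("model precision, recall :", precision_recall(y_true, y_model))
```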

  • From secured Legacy systems to interoperable services (The careful evolution of the French Tax Administration to provide new possibilities while ensuring the primary tax recovering objective)   Order a copy of this article
    by Christophe GAIE 
    Abstract: The purpose of this paper is two-fold. First, the author describes the interest of opening up Legacy systems in large organizations instead of replacing them from scratch. A review of similar approaches in the literature is provided, and a concrete method based on combining a REST architecture with Legacy systems is proposed. Second, the author provides feedback on the different REST solutions available, to facilitate their use by Information Technology (IT) architects. The paper also points out the importance not only of preserving the proper functioning of the Legacy heritage but also of progressively migrating applications to modernized languages. Assuredly, the existing code is robust, covers the whole business perimeter and is maintained by experts, whereas new technologies may suffer from a lack of stability and/or of technical expertise within the organization. This advocates a progressive migration from Legacy to modern applications, especially in the specific context of essential public services. The paper finally details a method to perform the migration efficiently by introducing data exchange between the Legacy and modern parts of the hybrid architecture during the migration, and describes a method for selecting an API management solution suited to the particularities of the reader's organization.
    Keywords: API management; decoupling; IT migration; webservices; Service-oriented architecture (SOA); REST architecture; Legacy modernization; large organizations.
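
    The "open rather than rewrite" strategy can be sketched as a thin REST-style facade in front of an untouched legacy routine. The function names, record layout and resource path below are invented for illustration and are not the tax administration's actual interfaces:

```python
import json

# Hypothetical legacy routine: fixed-width record in, delimited string out.
# Its name and record layout are illustrative assumptions.
def legacy_get_taxpayer(record: str) -> str:
    taxpayer_id = record[:8].strip()
    return f"ID={taxpayer_id};STATUS=UP_TO_DATE"

# Thin REST-style facade: translates a resource path into the legacy call
# format and re-exposes the result as JSON, leaving the legacy code untouched.
def handle_request(method: str, path: str) -> tuple[int, str]:
    if method != "GET" or not path.startswith("/taxpayers/"):
        return 404, json.dumps({"error": "not found"})
    taxpayer_id = path.removeprefix("/taxpayers/")
    raw = legacy_get_taxpayer(f"{taxpayer_id:<8}")       # pad to fixed width
    fields = dict(kv.split("=") for kv in raw.split(";"))
    return 200, json.dumps({"id": fields["ID"], "status": fields["STATUS"]})

status, body = handle_request("GET", "/taxpayers/1234567")
print(status, body)
```

    In a hybrid architecture of this kind, new consumers depend only on the JSON contract, so the legacy routine behind the facade can later be replaced without breaking them.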

  • Protecting Child on the Internet using Deep Generative Adversarial Networks   Order a copy of this article
    by Sabira Ojagverdiyeva 
    Abstract: In this paper, a generative adversarial network (GAN) is used to sanitize harmful information. An approach consisting of two blocks that sanitize data in order to control children's access to malicious (harmful) information on the Internet is proposed. The first block is a generator containing a deep autoencoder neural network, and the second is a discriminator containing a logistic regression classifier. In the proposed approach, the autoencoder inside the generator adds noise to transform sensitive attributes, which are considered dangerous for children, into non-sensitive ones, while the logistic regression inside the discriminator classifies the transformed data. The purpose of the anonymizer (generator) is to minimize the recognition efficiency of the classifier by transforming malicious content into non-malicious content. To maintain the usefulness of the information during the transformation, the privacy and utility rates of the sanitized data are measured, and the expected risks and the optimal trade-off between these two parameters are obtained with a minimax algorithm. In experiments on synthetic data, the classification algorithm recognizes the sensitive class with low accuracy and the non-sensitive class with high accuracy.
    Keywords: child protection; data sanitization; autoencoder; deep learning; Generative Adversarial Networks.
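
    The privacy/utility tension the abstract describes can be imitated with a toy stand-in: a noise-adding "sanitizer" in place of the autoencoder generator, and a hand-rolled logistic-regression discriminator. All distributions and parameters are assumptions, not the paper's setup:

```python
import math, random

random.seed(1)

# A 1-D "sensitive" attribute separates two classes; the sanitizer adds
# noise so the discriminator recognizes the sensitive class less reliably.
n = 1000
xs = [random.gauss(1.0 if i % 2 else -1.0, 1.0) for i in range(n)]
ys = [i % 2 for i in range(n)]

def sanitize(x, noise_sd=3.0):
    return x + random.gauss(0.0, noise_sd)

def train_logreg(xs, ys, epochs=500, lr=0.1):
    w = b = 0.0
    for _ in range(epochs):                      # batch gradient descent
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw, gb = gw + (p - y) * x, gb + (p - y)
        w, b = w - lr * gw / len(xs), b - lr * gb / len(xs)
    return w, b

def acc(w, b, xs, ys):
    return sum((1.0 / (1.0 + math.exp(-(w * x + b))) > 0.5) == bool(y)
               for x, y in zip(xs, ys)) / len(xs)

clean_acc = acc(*train_logreg(xs, ys), xs, ys)
noisy = [sanitize(x) for x in xs]
noisy_acc = acc(*train_logreg(noisy, ys), noisy, ys)
print(f"discriminator accuracy, raw data      : {clean_acc:.2f}")
print(f"discriminator accuracy, sanitized data: {noisy_acc:.2f}")
```

    The paper's GAN learns this transformation adversarially instead of using fixed noise, and balances the privacy gain against utility loss via the minimax objective.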

  • Question Answering System for Agriculture Domain using machine learning techniques: literature survey and challenges   Order a copy of this article
    by Prashant Niranjan, Vijay Rajpurohit 
    Abstract: Natural language processing (NLP) is a branch of artificial intelligence and computer science that provides an interaction between computers and human languages; its role is to program computers to process and analyze large amounts of human-language data. It is an important aspect of every domain and helps satisfy people's requirements. Nowadays, most people use mobile phones to receive up-to-date information as per their requirements. A question answering system (QAS) can provide succinct information in response to the questions asked by a user, giving answers based on rules stored in a database. This survey paper details what a question answering system is and reviews previous related work with respect to the methods, technologies and approaches used. It identifies research gaps and future scope in the reviewed papers, which helps researchers choose a suitable solution to their problems. Comparative analyses are provided where available.
    Keywords: Question-answering; QAS; Natural language processing; Answer Extraction; human-computer interaction; Artificial intelligence.
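
    The "answers based on rules stored in a database" style of QAS mentioned above can be sketched as simple token-overlap retrieval over a small stored question-answer base; the agricultural FAQ entries below are invented for illustration:

```python
# Minimal retrieval-style QAS sketch: score the user's question against
# stored questions by token overlap and return the best entry's answer.
# The FAQ entries are invented for illustration.
FAQ = {
    "when should wheat be sown": "Wheat is typically sown in early winter.",
    "how much water does rice need": "Rice requires standing water for much of its growth.",
    "which fertiliser suits tomatoes": "Tomatoes respond well to balanced NPK fertiliser.",
}

STOPWORDS = {"the", "is", "a", "for", "does", "be", "should", "how", "much"}

def tokens(text):
    return {w for w in text.lower().replace("?", "").split() if w not in STOPWORDS}

def answer(question):
    best = max(FAQ, key=lambda q: len(tokens(q) & tokens(question)))
    if not tokens(best) & tokens(question):
        return "Sorry, no stored answer matches."
    return FAQ[best]

print(answer("When is wheat sown?"))
```

    Machine-learning QAS approaches reviewed in the survey replace this hand-written matching rule with learned question understanding and answer extraction.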