Forthcoming articles

International Journal of Enterprise Network Management (IJENM)

These articles have been peer-reviewed and accepted for publication but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

Register for our alerting service, which notifies you by email when new issues are published online.

Open Access: Articles marked with this Open Access icon are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licenses.
We also offer RSS feeds which provide timely updates of tables of contents, newly published articles and calls for papers.

International Journal of Enterprise Network Management (38 papers in press)

Regular Issues

  • Performance and Emission Characteristics of a DI Diesel Engine using Diestrol Blends and diesel as fuel   Order a copy of this article
    by BIBIN CHIDAMBARANATHAN, Seeni Kannan Pauldurai, Devan Ponnusamy Kumarasami, Rajesh Rajamoni 
    Abstract: Biofuels, namely biodiesel and ethanol produced from renewable energy sources, are used in blended form along with diesel to investigate the performance and emission characteristics of a DI diesel engine. A diestrol blend consists of diesel, biodiesel/methyl ester and ethanol. In the diestrol blends, the ethanol percentage is raised in increments of 5% up to a maximum of 15% by volume, yielding three blends named EB5, EB10 and EB15 respectively. A comprehensive analysis of engine performance characteristics, such as brake thermal efficiency, brake specific fuel consumption and exhaust gas temperature, and emission characteristics, such as carbon monoxide, carbon dioxide, unburned hydrocarbons, oxides of nitrogen and smoke opacity, was carried out. The investigation found that brake thermal efficiency increased by 3% and 5%, and oxides of nitrogen emissions decreased by 23% and 24.5%, when compared to diesel and B20 respectively.
    Keywords: diesel engines; diestrol; ethanol; emissions; methyl esters; punnai oil; performance; ternary blends.
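
    A quick orientation on the two headline metrics above, brake thermal efficiency (BTE) and brake specific fuel consumption (BSFC): BTE is brake power divided by the fuel energy input, and BSFC is fuel mass flow per unit of brake power. A minimal sketch of these standard definitions, with illustrative numbers rather than data from the paper:

      # Standard engine performance metrics; inputs are illustrative placeholders.
      brake_power_kw = 3.7                 # measured brake power (kW)
      fuel_flow_kg_per_h = 1.1             # fuel mass flow rate (kg/h)
      calorific_value_kj_per_kg = 42500.0  # lower heating value of the blend (kJ/kg)

      # BSFC: fuel burned per unit of brake work.
      bsfc_kg_per_kwh = fuel_flow_kg_per_h / brake_power_kw

      # BTE: brake power out over fuel energy in (kJ/s = kW).
      fuel_power_kw = fuel_flow_kg_per_h / 3600.0 * calorific_value_kj_per_kg
      bte = brake_power_kw / fuel_power_kw

      print(f"BSFC = {bsfc_kg_per_kwh:.3f} kg/kWh, BTE = {bte:.1%}")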

  • Prediction of carotid atherosclerosis in patients with impaired glucose tolerance: a performance analysis of machine learning techniques   Order a copy of this article
    by Maruthamuthu A, M. Punniyamoorthy, Swetha Manasa Paluru, Sindhura Tammuluri 
    Abstract: The focus of this paper is to examine factors associated with carotid atherosclerosis in patients with impaired glucose tolerance (IGT), and to predict the rapid progression of carotid intima-media thickness (IMT). The linear support vector machine, non-linear support vector machine with a radial basis kernel function, multilayer perceptron (MLP) and Naive Bayes methods were employed. These methods were compared using the Brier score, and their accuracy was tested using a confusion matrix. The proposed machine learning methods performed well and accurately predicted the progression of carotid IMT.
    Keywords: Multilayer Perceptron; Support Vector Machine; Radial Basis Kernel function; Impaired Glucose Tolerance; Carotid Atherosclerosis; Naive Bayesian model; Brier Score.
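
    For readers unfamiliar with the Brier score used in the comparison above: it is the mean squared difference between predicted probabilities and binary outcomes, with 0 being perfect. A minimal sketch with invented values:

      import numpy as np

      # 1 = rapid IMT progression; probabilities are invented for illustration.
      y_true = np.array([1, 0, 1, 1, 0])
      y_prob = np.array([0.9, 0.2, 0.6, 0.8, 0.3])

      brier = np.mean((y_prob - y_true) ** 2)
      print(f"Brier score = {brier:.3f}")  # lower is better; 0.25 = always predicting 0.5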

  • Do you gain by Green Supply Chain Management (GSCM)?   Order a copy of this article
    by Balakrishnan AS, Jayshree Suresh 
    Abstract: The importance of green practices has increased owing to environmental change. The burning of oil and other fossil fuels releases carbon dioxide, which rises, blankets the earth and traps heat. Environmental issues have been studied intensively and dealt with extensively by practitioners and academics, and there is increasing pressure on businesses to improve both economic and environmental performance. Green supply chain management (GSCM) is an emerging approach offering economic and ecological benefits to manufacturers. This paper presents a case study of how GSCM is practised at Ford India in the areas of logistics, packaging and manufacturing processes, how GSCM influences firm performance, and the gains from extending it across firms in developing markets such as India.
    Keywords: Green supply chain management; GSCM; logistics; manufacturing; packaging; India.

  • An Analytical Study of Lean Implementation Measures in Pump Industries in India   Order a copy of this article
    by Mohan Prasad M, Ganesan K, Paranitharan K.P, Rajesh R 
    Abstract: The manufacturing industries in India are gearing up to face the challenges of quality, timely delivery and satisfying customer needs in the international market. This has prompted some large manufacturing industries to implement lean thinking in their manufacturing processes, but most manufacturing companies are yet to take up this task. In particular, the pump manufacturing industries, which are mostly occupied by SMEs, are still to follow suit. In this context, this research study makes a sincere attempt to survey the implementation of lean in pump manufacturing industries in India through an instrument consisting of seven lean implementation measures, namely: reasons to implement lean practice; lean tools/concepts employed in the company; reasons for low priority towards lean implementation; major barriers in lean practices; evaluation of the level of waste in the company; success factors of lean practising in the company; and lean performance indicators. A survey-type study was conducted, and the results indicated that the identified lean implementation measures were significant in achieving lean implementation in pump manufacturing industries.
    Keywords: Lean Practice; Lean Implementation Measures; Pump Manufacturing.

  • COMBINED STATIC ECONOMIC & EMISSION DISPATCH BY IMPROVED MOTH OPTIMISATION WITH VALVE POINT LOADING   Order a copy of this article
    by Vennila H, Rajesh R 
    Abstract: The need for electricity is a defining feature of the modern age. As demand increases, so must production, and the size of power systems grows day by day. Deciding the contribution of each generator in a power system is a complex problem: some generators may be more fuel efficient, while others may operate more cleanly. With refinements such as valve point loading added to the cost model to reflect real operation, it becomes necessary to find new and better ways to determine the power to be supplied by each generator. This paper aims to find an optimum solution to the problem using an algorithm inspired by the flight pattern of moths. Like a moth drawn to a flame, the algorithm homes in on the optimal solution, to minimise fuel cost as well as the emission of harmful gases. Moth flame optimisation is a simple and robust method for discovering the optimal solution in a vast search space. Thus, by implementing a heuristic algorithm such as moth flame optimisation, the complex problem of deciding the power to be generated by each generator in a power system can be vastly simplified, and the optimal result obtained easily and efficiently.
    Keywords: Economic Dispatch; Emission Dispatch; Improved Moth Flame Optimisation; Heuristic Algorithm.
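
    Valve point loading, mentioned above, is conventionally modelled by adding a rectified sinusoid to each generator's quadratic fuel cost, which makes the objective non-smooth and motivates heuristics such as moth flame optimisation. A minimal sketch of that standard cost model, with illustrative coefficients:

      import math

      def fuel_cost(p, a, b, c, e, f, p_min):
          """Quadratic fuel cost plus the standard valve-point-loading term."""
          return a + b * p + c * p**2 + abs(e * math.sin(f * (p_min - p)))

      # Illustrative coefficients for a single unit, not data from the paper.
      cost = fuel_cost(p=300.0, a=500.0, b=5.3, c=0.004, e=50.0, f=0.063, p_min=100.0)
      print(f"cost at 300 MW = {cost:.1f} $/h")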

  • A hybrid algorithm to solve stochastic flow shop scheduling problems with machine breakdowns   Order a copy of this article
    by MARICHELVAM MARIAPPAN KADARKARAINADAR 
    Abstract: A flow shop scheduling problem with uncertain processing times and machine breakdowns is considered in this paper. The objective is to minimize the maximum completion time (makespan). As the problem is NP-hard (non-deterministic polynomial-time hard), a hybrid algorithm (HA) is proposed to solve it. In the proposed HA, the firefly algorithm (FA) is hybridized with the variable neighbourhood search (VNS) algorithm. Extensive computational experiments are carried out with random problem instances to validate the performance of the proposed algorithm.
    Keywords: scheduling; NP-hard; flow shop; makespan; firefly algorithm; variable neighbourhood search.
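
    For context, the makespan objective above has a simple recurrence in a permutation flow shop: a job cannot start on a machine before it finishes on the previous machine and the machine becomes free. A minimal deterministic sketch with toy processing times:

      def makespan(proc):
          """proc[j][m] = processing time of job j (in sequence order) on machine m."""
          n_jobs, n_mach = len(proc), len(proc[0])
          c = [[0.0] * n_mach for _ in range(n_jobs)]
          for j in range(n_jobs):
              for m in range(n_mach):
                  ready_machine = c[j - 1][m] if j > 0 else 0.0  # machine m frees up
                  ready_job = c[j][m - 1] if m > 0 else 0.0      # job leaves machine m-1
                  c[j][m] = max(ready_machine, ready_job) + proc[j][m]
          return c[-1][-1]

      print(makespan([[3, 2, 4], [1, 5, 2], [2, 2, 3]]))  # toy 3-job, 3-machine instance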

  • Critical review of literature and development of a framework for application of Artificial Intelligence in business   Order a copy of this article
    by Sanjay Mohapatra 
    Abstract: Artificial intelligence (AI) has the ability to predict outcomes accurately and reliably, and its techniques have been used in several industries and domains. However, the results of the different studies that have been conducted have not been systematically documented. Moreover, most of this research has been carried out in developed countries, and not much work has been published from other economies. As a result, there is a need to develop a proper research background so that applications of AI can be sustainable and effective. The purpose of this study is to critically review studies that have adopted AI in several domains, so that a theoretical framework to guide researchers and practitioners can be developed. This framework will also establish future trends in the research area. Relevant articles and extracts were retrieved from online databases and systematically analysed, and a framework was developed using these inputs. The findings show that there is a gap between the research work done and the documentation available. Present applications of AI techniques require a model-based approach that brings consistency to research as well as to industry, and a paradigm shift towards a framework-based approach could lead to sustainable practice.
    Keywords: Artificial Intelligence; Framework; Theoretical Study; AI applications.

  • Labor productivity improvement using hybrid Maynard Operation Sequence Technique and Ergonomic assessment   Order a copy of this article
    by Medha LNU, Sharath Kumar Reddy, Vimal KEK, Aravind Raj Sakthivel, Jayakrishna Kandasamy 
    Abstract: Productivity measures how efficiently production inputs, such as labour and capital, are used in an economy to produce a given level of output. With growing competition across the globe, contemporary organizations are under pressure to exploit the untapped potential of their labour. The Maynard operation sequence technique (MOST) is a work measurement system that can be easily implemented and practically maintained. A basic ergonomic analysis was also conducted to understand the interactions among the labour and other elements of a system, in order to optimize human well-being and overall system performance. In this article, MOST is used for a time measurement study in a stamping unit, and ergonomics is used to minimize fatigue among the operators. The primary objective was to reduce the motions of a task, and thereby the effort and time needed to perform it, to achieve higher production and a better service level through the ergonomic approach. Ergonomics accounts for users' capabilities and limitations to ensure that tasks, functions, information and environment suit each operator in the organization. A scoring-sheet approach was used in the ergonomics study to decide the fitness of any unit on the basis of safety and posture analysis of the operator. The hybrid approach (MOST-Ergo) can be used to improve the productivity of any organization by reducing the time taken and fatigue incurred by the operator during the operation.
    Keywords: maynard operation sequence technique; MOST; time study; standard time; productivity; ergonomic work posture analysis.
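
    As background for the time study above: in basic MOST, each activity is written as a sequence model (e.g. A B G A B P A for a general move), the index values are summed and multiplied by 10 to give time measurement units (TMU), where 1 TMU = 0.036 s, and allowances are then added to obtain the standard time. A minimal sketch under those standard conventions; the index values and allowance are illustrative:

      TMU_SECONDS = 0.036  # 1 TMU = 0.036 s (100,000 TMU per hour)

      def basic_most_seconds(index_values, allowance=0.15):
          """BasicMOST: sum of sequence-model index values x 10 = TMU, plus allowances."""
          tmu = sum(index_values) * 10
          normal_time_s = tmu * TMU_SECONDS
          return normal_time_s * (1 + allowance)

      # Illustrative general-move sequence A6 B6 G1 A1 B0 P3 A0.
      print(f"standard time = {basic_most_seconds([6, 6, 1, 1, 0, 3, 0]):.2f} s")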

  • An Efficient method for Adjustable Load Equalization for Reducing Traffic in Routing for Mobile Ad Hoc Networks   Order a copy of this article
    by T.C. Ezhil Selvan, P. Malathi 
    Abstract: Congestion is a crucial problem in mobile ad hoc networks, obstructing the full performance of the network. Load balancing in a congested network is a further major difficulty in mobile ad hoc networks, and routing decisions are driven by differences in the costs assigned to links. Several prevailing routing protocols address load, or changing congestion, in terms of scalability. The intention here is to design a traffic-adjustable routing protocol (LETARP). Shifting load from congested nodes to less critical nodes, and sharing it with other capable nodes within the route, can improve the overall lifespan of the network. The proposed bi-directional standard, based on congestion density and link costs, is used for routing under varying congestion situations: routes with low congestion intensity are favoured, and normalisation is used to select an enhanced service life for packet-based transmissions. The performance of networks based on the designed LETARP is evaluated against congestion-adjustable routing protocols in terms of packet delivery ratio, end-to-end delay and routing overhead.
    Keywords: Blockage; Congestion; Scalability; Mobile Ad Hoc Networks; Density.

  • Context-sensitive contrastive feature-based opinion summarization of online reviews   Order a copy of this article
    by Lavanya Sk, Parvatha Varthini 
    Abstract: Online reviews discuss one product at a time, so comparing and contrasting two or more products becomes a time-consuming process for a user. Contrastive opinion summarization (COS) was introduced to help users by producing a useful contrastive summary. A new COS approach, called context-sensitive contrastive opinion summarization, is introduced in this paper. The method extracts the features and opinions of a product based on the context terms in a sentence and compares them with the opinions on another product, to develop a brief summary comprising contrastive feature-opinion pairs. Traditional COS methods rely on content similarity and contrastive similarity functions based on simple semantic term matching using WordNet; more advanced similarity functions are needed for in-depth semantic analysis. A clustering algorithm is proposed which adds context similarity into the PLSA model for semantic term matching by incorporating context priors from ConceptNet. In addition, the algorithm attempts to align the contrastive feature-opinion pairs in each cluster to generate a contrastive summary of different products. Compared to previous methods, the proposed model can integrate context sensitivity and better align the contrastive opinion pairs to produce a better argument summary. Experiments were conducted on the benchmark UCI Machine Learning Repository car data sets, and the results exhibit the usefulness of context similarity over the baseline content similarity and contrastive similarity measures.
    Keywords: Feature-based Opinion Summarization; Contrastive summary; Contrastiveness; Representativeness; Context-aware sentiment analysis.

  • Owner-managers' perceptions of corporate social responsibility practices within small and medium-sized accounting firms - an Australian study   Order a copy of this article
    by Sujana Adapa, Josie Fisher 
    Abstract: This article explores conceptualisations of corporate social responsibility (CSR), perceptions of its importance, and the practices implemented by owner-managers of small and medium-sized enterprises (SMEs) in Australia. Qualitative in-depth interview data were obtained from 17 owner-managers of small and medium-sized accounting firms operating in Sydney. Inductive content analysis was conducted to identify the important concepts and themes, using the Leximancer qualitative text-analysis software. The results revealed that the owner-managers of these firms were aware of the basics of social responsibility and recognised that adopting responsible business practices contributes to business success. The owner-managers' perceptions of CSR practices varied with firm size, which resulted in the emergence of an additional category of family-owned firms. Additionally, a subset of small micro-sized firms emerged on the basis of their CSR practices and unique orientations towards the concept of CSR. As outlined by the owner-managers in the study, retaining market position seemed critical to the adoption of CSR practices in accounting firms; thus, the majority of the small and medium-sized accounting firms in the sample appeared to adopt reactive CSR practices to retain their market position, while very few followed proactive CSR practices.
    Keywords: corporate social responsibility; small and medium-sized firms; owner-managers; stakeholders; firm size; accounting.

  • MAXIMISING THE EFFICIENCY OF KEYWORD ANALYTICS FRAMEWORK IN WIRELESS MOBILE NETWORK MANAGEMENT   Order a copy of this article
    by Geetha Kannan, Kannan A 
    Abstract: Nowadays, objects in spatial databases are associated with keywords. Over the past decade, keyword search has been a major and active research area for the database and information retrieval communities across various applications. In recent years, maximising availability and ranking the most frequent keyword items in a spatial database have been used to support better decisions. This motivates research on the closest keyword cover search, also known as the fine-tuned keyword cover search methodology, which considers both inter-object distance and the keyword ratings of items in the spatial environment. The baseline algorithm derived in this area has its own drawbacks: as the number of query keywords increases, query performance degrades gradually owing to the generation of candidate keyword covers. To resolve this problem, a new scalable methodology is proposed in this paper.
    Keywords: Information retrieval; Keyword searching; Nearest Neighbor; Point of Interest; Spatial database.

  • IMPROVING MECHANICAL STRENGTH ON WELDED JOINTS BY USING OPTIMIZATION TECHNIQUE   Order a copy of this article
    by Pandiyarajan R 
    Abstract: The welding process is an important component of many industrial operations. Welding input parameters play a very significant role in determining the quality of a weld joint. Joint quality can be defined in terms of properties such as weld-bead geometry, mechanical properties and distortion. Generally, all welding processes are used with the aim of obtaining a welded joint with the desired weld-bead parameters and excellent mechanical properties with minimum distortion. Hence, this paper presents a way to find suitable input parameters for welded joints and so achieve full penetration of the material. Design of experiments (DOE), genetic algorithms (GA) and computational networks are widely used to develop a mathematical relationship between the welding process input parameters and the output variables of the weld joint, in order to determine the welding input parameters that lead to the desired weld quality of the welded material.
    Keywords: Welding; DOE; Parameter Optimization; GA; Regression Equation.
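
    A minimal sketch of the DOE-regression-then-optimise workflow outlined above: fit a regression equation to experimental runs, then search the fitted model with a simple evolutionary loop (a stand-in for the paper's GA; all data and values here are invented for illustration):

      import numpy as np

      rng = np.random.default_rng(0)

      # Illustrative DOE runs: welding current (A) vs. tensile strength (MPa).
      current = np.array([80.0, 100.0, 120.0, 140.0, 160.0])
      strength = np.array([305.0, 342.0, 360.0, 351.0, 322.0])

      # Step 1: regression equation (quadratic fit) from the DOE data.
      model = np.poly1d(np.polyfit(current, strength, deg=2))

      # Step 2: simple evolutionary search over the fitted model.
      pop = rng.uniform(80.0, 160.0, size=30)
      for _ in range(50):
          children = np.clip(pop + rng.normal(0.0, 5.0, size=pop.size), 80.0, 160.0)
          pool = np.concatenate([pop, children])
          pop = pool[np.argsort(model(pool))][-30:]  # keep the fittest half

      best = pop[np.argmax(model(pop))]
      print(f"predicted optimum ~ {best:.0f} A, strength ~ {model(best):.0f} MPa")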

  • A Study on the Impact of Psychological Empowerment on Motivation and Satisfaction among the Faculty Working in the Technical Educational Institutions in India based on Age and Work Experience difference   Order a copy of this article
    by Prabha Mariappan, Punniyamoorthy Murugesan, Nivethitha Santhanam 
    Abstract: The purpose of this study is to investigate how the impact of psychological empowerment (PE) on motivation and satisfaction varies according to faculty members' age and work experience. Data were collected from 402 faculty members employed at technical institutions across India. The results show that faculty members of above-average age exhibited higher PE, motivation and satisfaction, and that faculty members with above-average experience possessed higher levels of PE and satisfaction. Further, these findings suggest that the proposed framework can act as a benchmarking tool to measure psychological empowerment among faculty members.
    Keywords: Psychological Empowerment; Age; Work experience; Faculty.

  • A Study on the Impact of Macroeconomic Indicators on the Stock Price by relaxing the Assumptions of Stationarity in Time Series Data in a General Linear Model   Order a copy of this article
    by M.I. Nafeesathul Basariya, Punniyamoorthy Murugesan 
    Abstract: A model has been developed with the stock index as the dependent variable and gross domestic product, consumption and the consumer price index as independent variables. The assumptions underlying the model are tested, and the behaviour of the model is studied by including and relaxing the important assumption of stationarity in the economic data. It was found that the model becomes significant if the stationarity assumption is relaxed for both the dependent variable (stock price) and the independent variables (consumer price index, consumption and gross domestic product). This is demonstrated using data on the macroeconomic variables of developed countries (USA, UK), emerging countries (India and Brazil) and frontier countries (Latvia and Estonia).
    Keywords: Consumer Price Index; Consumption; Gross Domestic Product; Stock Prices; Stationarity.
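
    For readers who want to reproduce the stationarity checks such a model rests on, the augmented Dickey-Fuller test in statsmodels is the usual tool. A minimal sketch on a synthetic random walk standing in for a stock index or CPI series:

      import numpy as np
      from statsmodels.tsa.stattools import adfuller

      rng = np.random.default_rng(1)
      series = np.cumsum(rng.normal(size=300))  # random walk: non-stationary by design

      stat, pvalue, *_ = adfuller(series)
      print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")

      # A large p-value means a unit root cannot be rejected; difference and retest.
      _, diff_p, *_ = adfuller(np.diff(series))
      print(f"after differencing: p-value = {diff_p:.3f}")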

Special Issue on: BDSCC-2018 Big Data Innovation for Sustainable Intelligent Computing

  • Convergence of Partial Differential Equation Using Fuzzy Linear Parabolic Derivatives   Order a copy of this article
    by Shanthi Devi Palanisamy, Viswanathan Ramasamy 
    Abstract: Discovering solutions for partial differential equations (PDEs) is very difficult, and an exact solution can be identified only in particular cases. Numerical methods for PDEs have therefore attained growing significance in recent years. In this paper, we consider the convergence of a partial differential equation using the fuzzy linear parabolic (PDE-FLP) method on a finite domain. The method is based on constructing a PDE whose coefficients and initial conditions are given as fuzzy numbers and solving it with linear parabolic derivatives. The linear parabolic derivative serves as the basis on which the fuzzy form is solved as a numerical solution of the PDE. First, the PDE form in two independent variables and the fuzzy representation of the two independent variables are derived. Second, the fuzzy linear parabolic derivative is used to obtain convergence to a numerical solution of the PDE. Fuzzy linear parabolic derivatives are employed to describe a wide variety of time-dependent evolution, and parabolic derivatives are used in PDE-FLP because the coefficients match the conditions of the analytic solution. Finally, numerical results are given which demonstrate the effectiveness and convergence of the PDE-FLP method, a detailed comparison between the approximate solutions obtained by these methods is discussed, and a figurative comparison of the approximate solutions is also presented.
    Keywords: Partial Differential Equation; Fuzzy; Linear Parabolic; Domain; Numerical solution.
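
    As orientation for the parabolic setting above (leaving the paper's fuzzy coefficients aside), the prototypical linear parabolic PDE is the heat equation u_t = k u_xx, and the simplest convergent treatment is the explicit finite-difference scheme, stable when r = k*dt/dx^2 <= 1/2. A minimal crisp-coefficient sketch:

      import numpy as np

      k, dx, dt, steps = 1.0, 0.1, 0.004, 250
      r = k * dt / dx**2
      assert r <= 0.5, "explicit scheme is unstable for r > 1/2"

      x = np.arange(0.0, 1.0 + dx, dx)
      u = np.sin(np.pi * x)  # initial condition; boundaries stay 0 (Dirichlet)

      for _ in range(steps):
          # Forward difference in t, central difference in x.
          u[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])

      print(u.round(4))  # decays towards zero, matching the analytic solution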

  • Document Similarity Approach using Grammatical Linkages with Graph Databases   Order a copy of this article
    by Priya V, Umamaheswari K 
    Abstract: Document similarity has become essential in many applications, such as document retrieval, recommendation systems and plagiarism checkers. Many similarity evaluation approaches rely on word-based document representation because it is very fast, but these approaches are not accurate when documents with different language and vocabulary are used. When a graph representation is used for documents, it draws on relational knowledge that is not feasible in many applications because of expensive graph operations. In this work, a novel approach to document similarity computation utilising verbal intent has been developed; it improves similarity estimation by increasing the number of linkages, using verbs, between two documents. Graph databases are used for faster performance. The performance of the system is evaluated using metrics such as cosine similarity, Jaccard similarity and the Dice coefficient on different review data sets. The verbal-intent-based approach has registered promising results based on the links between two documents.
    Keywords: Document similarity; Graph Database; Grammatical Linkages; Knowledge graph; Text Similarity.
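
    For reference, the three evaluation measures named above have compact definitions over token sets and term-frequency vectors. A minimal sketch on toy sentences:

      from collections import Counter
      from math import sqrt

      a = "the cat sat on the mat".split()
      b = "the cat lay on a mat".split()

      sa, sb = set(a), set(b)
      jaccard = len(sa & sb) / len(sa | sb)
      dice = 2 * len(sa & sb) / (len(sa) + len(sb))

      ca, cb = Counter(a), Counter(b)
      dot = sum(ca[t] * cb[t] for t in sa & sb)
      norm = sqrt(sum(v * v for v in ca.values())) * sqrt(sum(v * v for v in cb.values()))
      cosine = dot / norm

      print(f"Jaccard={jaccard:.2f}  Dice={dice:.2f}  cosine={cosine:.2f}")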

  • A CUSTOMER BASED SUPPLY CHAIN NETWORK DESIGN   Order a copy of this article
    by Anand.T, Sudhakara Pandian R 
    Abstract: This study synthesises and proposes a new algorithm for a customer-to-customer supply chain management system. In parallel, we consider cost reductions through quantity rebates for inbound and outbound transportation logistics. The study utilises an approximation procedure to simplify distance calculations and builds an algorithm to solve supply chain management issues using a non-linear optimisation technique. Numerical studies illustrate the solution procedure and the influence of model parameters on supply chain management and total costs. The study can serve as a reference for top-level management and organisations.
    Keywords: Customer to Customer Network design; facility location-allocation; inventory policy; continuous approximation approach.

  • Proficient smart trash can management using Internet of Things and SDN architecture approach   Order a copy of this article
    by Vairam Thangavel 
    Abstract: Most metropolitan cities face the problem of collecting garbage on time. Owing to inadequate garbage collection, trash bins overflow and cause various risks, such as the spread of disease, unpleasant odours and ugliness. To avoid these circumstances, this paper addresses the efficient collection of garbage by implementing an IoT-based smart trash can system. The system monitors the overall status of all trash cans around the city: whenever the trash level of a bin reaches the threshold, it sends an alert message to the truck driver, and it also helps the garbage truck driver by providing the shortest path to attend to all trash cans in the city. The IoT-based smart trash can is implemented using a Raspberry Pi board with an HC-SR04 ultrasonic sensor for measuring the trash level. Amazon Web Services (AWS) supports the storage of data and the sending of notifications to the people involved in the garbage collection process. Managing data traffic in an IoT network is a difficult task; we address this issue by designing a software-defined networking (SDN) architecture for the smart trash can system, which further helps to improve the performance of the system.
    Keywords: Software Defined Networking; Internet of Things (IoT); Trash Bin; Amazon Web Services.
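
    A minimal sketch of the trash-level measurement described above, assuming an HC-SR04 wired to a Raspberry Pi and read through the RPi.GPIO library; the pin numbers, bin depth and 80% alert threshold are illustrative choices, not values from the paper:

      import time
      import RPi.GPIO as GPIO

      TRIG, ECHO = 23, 24      # illustrative BCM pin numbers
      BIN_DEPTH_CM = 100.0     # illustrative bin depth
      ALERT_LEVEL = 0.8        # notify the driver above 80% full

      GPIO.setmode(GPIO.BCM)
      GPIO.setup(TRIG, GPIO.OUT)
      GPIO.setup(ECHO, GPIO.IN)

      def distance_cm():
          """Fire a 10 us trigger pulse and time the echo; sound travels ~34300 cm/s."""
          GPIO.output(TRIG, True)
          time.sleep(0.00001)
          GPIO.output(TRIG, False)
          start = stop = time.time()
          while GPIO.input(ECHO) == 0:
              start = time.time()
          while GPIO.input(ECHO) == 1:
              stop = time.time()
          return (stop - start) * 34300.0 / 2.0  # halve the round trip

      fill = 1.0 - distance_cm() / BIN_DEPTH_CM
      if fill >= ALERT_LEVEL:
          print("bin nearly full - send a notification (e.g. via AWS SNS)")
      GPIO.cleanup()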

  • Decision Tree classification N-Tier Solution for preventing SQL Injection Attacks on Websites   Order a copy of this article
    by Naveen Durai K, Baskaran K 
    Abstract: The current situation has drawn everyone into the continuous use of web applications. As every task is performed through web applications, it is very important to secure them as far as possible. What is an SQL injection attack (SQLIA)? It can be defined as an attack mounted by users who do not possess access permissions but want to abuse access rights in the database and steal, edit or delete data as desired. To achieve this, they resort to malicious queries to leak data that is highly confidential. An SQL injection attack can be executed through the public interface an application provides, even when the host-level entry point and the network are secure. A common assumption is that an SQLIA cannot be mounted without single quotes, spaces or double dashes, which leaves observers few detection-friendly signals. A distinctive property is that when one relational database accepts an injection query, other relational databases accept the same query. The overall contributions of this paper are: a summary of SQLIA and SQLIA techniques; the shortcomings of previous models; and a proposal to use decision tree classification techniques to prevent SQL injection vulnerabilities. Decision-tree-based attack signatures are also proposed in this model to filter the HTTP requests sent as input by users: whenever a query is found to behave unfairly, an error message is sent to the user and the request is rejected concurrently; if it is clean, the scenario ends by processing and displaying the relevant web page. A comparison with existing prevention techniques is also carried out.
    Keywords: SQLIA; HTTP; OWASP; WEBSSARI; SAFELI.
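
    A minimal sketch of the kind of request filtering the abstract describes, assuming scikit-learn: a decision tree is trained on simple lexical features of request parameters (counts of quotes, dashes and SQL keywords). The features, training strings and labels are invented for illustration:

      from sklearn.tree import DecisionTreeClassifier

      def features(param):
          p = param.lower()
          return [p.count("'"), p.count("--"), p.count(" "),
                  sum(p.count(k) for k in ("select", "union", "drop", " or "))]

      train = ["john", "42", "a' or '1'='1", "1; drop table users --",
               "mary smith", "' union select * from accounts --"]
      labels = [0, 0, 1, 1, 0, 1]  # 1 = injection attempt

      clf = DecisionTreeClassifier(random_state=0)
      clf.fit([features(t) for t in train], labels)

      request_param = "x' or 1=1 --"
      if clf.predict([features(request_param)])[0] == 1:
          print("reject the request and return an error message")
      else:
          print("process and display the relevant web page")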

  • P-Tree oriented Association Rule mining of Multiple data sources   Order a copy of this article
    by Subha Krishna 
    Abstract: A prominent research area in the data mining field is association rule mining (ARM). As distributed databases emerged, the need to mine patterns across them arose, and hence distributed ARM algorithms were required; however, these algorithms increased communication complexity and overhead. P-tree-oriented distributed association rule mining (PDAM) with message-exchange optimisation is proposed in this paper. The proposed technique provides faster support-count computation, reduces both database scans and message exchanges, and would also reduce the size of average transactions, data sets and message exchanges.
    Keywords: Association Rule Mining; P-Tree; Distributed Association Rule Mining.
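
    For orientation, the support counts that PDAM accelerates are the basic quantities of ARM: support(X) is the fraction of transactions containing itemset X, and confidence(X => Y) = support(X u Y) / support(X). A minimal single-node sketch over toy transactions:

      transactions = [
          {"bread", "milk"},
          {"bread", "butter", "milk"},
          {"butter", "milk"},
          {"bread", "butter"},
      ]

      def support(itemset):
          return sum(itemset <= t for t in transactions) / len(transactions)

      x, y = {"bread"}, {"butter"}
      confidence = support(x | y) / support(x)
      print(f"support(bread) = {support(x):.2f}, confidence(bread => butter) = {confidence:.2f}")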

  • Development of Manufacturing - Distribution Plan Considering Quality Cost   Order a copy of this article
    by Gokilakrishnan G, Ashokavarthanan P 
    Abstract: In the current complex business world, making decisions on the manufacturing-distribution problem is a tedious task for supply chain managers. Solving a mathematical model with many entities requires a suitable algorithm to obtain optimum results that increase the profitability of an industrial activity, and any model that does not consider the percentage of rejections in a particular plant will not supply the right quality and quantity of products to customers. Here, a mathematical model is developed by considering quality cost in addition to normal-time manufacturing cost, subcontracting cost, transportation cost, overtime manufacturing cost, holding cost, cost of hiring and cost of firing. A mixed integer linear programming (MILP) model is developed and solved using a modified heuristic-based discrete particle swarm algorithm (DPSA), which generates the manufacturing-distribution plan that minimises the total cost for the bearing industry under study. The normal-time manufacturing loss and the overtime loss, in terms of product quantity and cost, are also calculated.
    Keywords: Quality cost; Optimization; Supply chain; Manufacturing-Distribution plan.

  • Chicken Swarm Optimization based Clustering of Biomedical Documents and Health Records to Improve Telemedicine Applications   Order a copy of this article
    by Sandhiya R, Sundarambal M 
    Abstract: The aim of this paper is to develop an efficient ontology-enabled chicken swarm optimization (CSO) based clustering algorithm with a dynamic dimension reduction (DDR) concept, to efficiently cluster biomedical documents and health records for telemedicine applications. A total of 350 documents and health records were collected online, mostly from the PubMed repository. First, the documents and health records are pre-processed via semantic annotation and concept mapping; the term frequency and inverse gravity moment (TF-IGM) factor is used to improve document representation, and a modified n-gram resolves substitution and deletion malpractices. The DDR technique then reduces the feature-space dimension and prunes non-useful text features, increasing clustering accuracy by tackling the high-dimensionality problem. Finally, the clusters are formed by the optimization-based clustering of CSO. The experimental simulations show that the CSO-DDR clustering model is significantly more efficient than traditional clustering algorithms, enabling reliable and adaptive telemedicine applications with better clustering of biomedical documents and health records.
    Keywords: Tele-medicine; health records; biomedical document clustering; semantic smoothing; TF-IGM; Chicken Swarm Optimization; Dimension Reduction.
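
    The TF-IGM factor used above is, in its published formulation, a supervised alternative to IDF: sort a term's class-level frequencies in descending order, compute the inverse gravity moment igm = f1 / sum_r(f_r * r), and weight the term as tf * (1 + lambda * igm), with lambda commonly around 7. A minimal sketch of that formulation as understood here:

      def tf_igm(tf, class_freqs, lam=7.0):
          """TF-IGM term weight: tf x (1 + lam x inverse gravity moment)."""
          f = sorted(class_freqs, reverse=True)  # term frequency per class, descending
          igm = f[0] / sum(fr * r for r, fr in enumerate(f, start=1))
          return tf * (1 + lam * igm)

      # A class-concentrated term outweighs an evenly spread one.
      print(tf_igm(tf=3, class_freqs=[30, 2, 1]))
      print(tf_igm(tf=3, class_freqs=[11, 11, 11]))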

  • Inclusive Strategic Techno-Economic Framework to Incorporate Essential Aspects of Web Mining for the Perspective of Business Success   Order a copy of this article
    by Damodharan Palaniappan, Ravichandran C.S 
    Abstract: Web mining as a whole has already become a field of study in its own right owing to its vast capabilities and inventive possibilities, which attract ever more researchers. However, the business aspect of web mining tends to be ignored, creating a wide divide between academic research and business applications. To draw attention to this reality, a taxonomy of the various successful web mining techniques that firms have used for online business is tabulated. The tabulation shows that web mining industries try to integrate the various types of web mining in ways that challenge the existing categorisation of web mining, and that the competing factor among them is the capability of web mining to drive business success. Therefore, a classification of web mining using online business reliance as a factor is considered: if the net effect of every click stream from a potential customer during an online session is expected to culminate in a 'buy', the strategy is exhaustive promote; otherwise it is partial promote. Both categories are then modelled as a Cournot competition, suitable for evaluating web mining as a set of strategies for achieving business targets. The intention behind modelling partial promote and exhaustive promote with Cournot game theory is to obtain a techno-economic framework that maps web mining uncertainties to business performance. The results show that the developed techno-economic web mining framework performs mining operations from the perspective of business success, and can therefore help management professionals make appropriate choices when selecting, fine-tuning or upgrading web mining techniques.
    Keywords: Web Mining; E- Commerce; Web Content Mining; Web Structure Mining; Decision Making; Knowledge Management.
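
    For readers new to the game-theoretic device above: in a two-player Cournot competition with linear inverse demand P = a - b(q1 + q2) and constant marginal costs, each player's best response is q_i = (a - c_i - b*q_j) / (2b), and with equal costs the equilibrium is q* = (a - c) / (3b). A minimal sketch with illustrative numbers:

      a, b = 100.0, 1.0    # inverse demand P = a - b * (q1 + q2)
      c1 = c2 = 10.0       # constant marginal costs

      q1 = q2 = 0.0
      for _ in range(100):  # iterate best responses until they settle
          q1 = max(0.0, (a - c1 - b * q2) / (2.0 * b))
          q2 = max(0.0, (a - c2 - b * q1) / (2.0 * b))

      print(f"q1 = {q1:.1f}, q2 = {q2:.1f}, closed form = {(a - c1) / (3.0 * b):.1f}")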

  • Effective transmission of critical parameters in heterogeneous wireless body area sensor networks   Order a copy of this article
    by V. Navya, P. Deepalakshmi 
    Abstract: Wireless body area networks have great potential to change the future of remote and personalised healthcare technology by embedding smart devices that provide real-time feedback. In this article, we propose a threshold-based routing concept to route only the critical data of bio-sensors during a patient's emergency condition. Sensor nodes attached to the body sense and forward patients' vital-sign data according to the standard thresholds applied during the routing process. Depending on variations in the sensed data, the energy parameters are calculated and data are routed to the coordinator node for further communication. An efficient node is selected based on the least cost value, which depends on high residual energy and a short distance to the sink. The results obtained show that the proposed technique improves on existing multi-hop routing techniques in terms of energy, stability period, network lifetime, throughput, path loss and packet delivery ratio.
    Keywords: wireless body area network; WBAN; real-time feedback; threshold-based routing technique; remote and personalized healthcare; an emergency condition of a patient; smart devices; critical data; standard thresholds applied; energy parameters; multi-hop routing.
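
    A minimal sketch of the next-hop selection rule described above, where cost falls with residual energy and rises with distance to the sink; the exact cost expression below is an illustrative choice, not necessarily the authors':

      # Candidate forwarders: (node id, residual energy in J, distance to sink in m).
      candidates = [("n1", 0.8, 12.0), ("n2", 0.5, 6.0), ("n3", 0.9, 20.0)]

      def cost(residual_energy, distance):
          # Cheaper when energy is high and the sink is near.
          return distance / residual_energy

      best = min(candidates, key=lambda n: cost(n[1], n[2]))
      print(f"forward via {best[0]}")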

  • Extreme Learning Machine and K-Means clustering for the improvement of link prediction in social networks using Analytic Hierarchy Process   Order a copy of this article
    by Gowri Thangam Jeyaraj, Sankar A 
    Abstract: Social network analysis (SNA) is the representation and analysis of social networks, with people as nodes and the relationships between them as links in a graph. Link prediction is one of the important tasks in SNA, relevant to healthcare, politics, national security, criminal investigations, recommender systems and DNA sequence prediction. The rapid growth in the availability of healthcare-related data raises the challenge of extracting useful information from these data, so there is an urgent need in the healthcare industry for efficient computational analysis tools to predict disease; such tools reduce the number of cumbersome tests patients must undergo to assess the possibility of disease. The link prediction problem can be viewed as a classification problem. The aim of this paper is to employ a combination of machine learning algorithms to predict disease in a patient by extracting different patterns from the data set based on the relationships that exist among the attributes. For this purpose, the extreme learning machine algorithm is combined with k-means clustering and the analytic hierarchy process. The outcome of these techniques would help physicians and medical scientists to predict the possibility of disease. In today's era, the percentage of females affected by diabetes has increased sharply, so the experiments are carried out on the PIMA diabetes data set, which focuses on females and is extracted from the UCI repository; the results are found to be significant.
    Keywords: Analytic Hierarchy Process; Extreme Learning Machine; K-Means clustering; Social Networks; Link Prediction.
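
    For context, the extreme learning machine named above is a single-hidden-layer network whose input weights are random and fixed; only the output weights are learned, in closed form via the Moore-Penrose pseudo-inverse. A minimal sketch on toy data standing in for clinical attributes:

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 4))              # 100 samples, 4 features
      y = (X[:, 0] + X[:, 1] > 0).astype(float)  # toy binary target

      n_hidden = 50
      W = rng.normal(size=(4, n_hidden))  # random, fixed input weights
      b = rng.normal(size=n_hidden)
      H = np.tanh(X @ W + b)              # hidden-layer activations

      beta = np.linalg.pinv(H) @ y        # the only trained parameters

      pred = (H @ beta > 0.5).astype(float)
      print(f"training accuracy = {(pred == y).mean():.2f}")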

  • Smart city video surveillance Using Fog Computing   Order a copy of this article
    by Prakash Ppp, Ragavi Suresh 
    Abstract: Conventional video surveillance systems require infrastructure, including expensive servers capable of processing images and storing video recordings. These surveillance systems produce, and need to store, a huge amount of data, and must execute in real time to detect safety events. The anti-social activities gradually increasing across the country, especially in urban areas, have created the need for technological innovation in security and surveillance systems. The proposed system is based on fog computing: the application has been modelled and simulated using iFogSim, and the results predict that the fog-based model is more secure and more efficient than cloud computing with respect to the energy consumption parameter. The proposed system helps to increase the effectiveness of intelligence agencies and thereby increase crime safety at public places.
    Keywords: Internet of Things; Fog Computing; Cloud Computing; Video Surveillance; Smart City.

Special Issue on: Applications and Innovations of Enterprise Networking Management in Social Commerce

  • NEURO FUZZY COGNITIVE CONNECTION FUNCTIONAL POINT FOR ENTERPRISE NETWORK MANAGEMENT   Order a copy of this article
    by Frankvijay J. 
    Abstract: Software effort estimation (SEE) plays a vital role in enterprise management because it helps to predict the amount of effort required for a particular process. The SEE process is used to minimise incomplete data involvement, uncertainty and other misbehaviours effectively. It consumes project plans, pricing information, investment details, budgets and iteration plans as input, and produces the exact effort for the particular enterprise details. During the process, SEE may require a large amount of data, which increases the efficiency of the effort estimation system. Hence, this paper introduces an enterprise data analytics process using size-based and judgemental software effort estimation. The method analyses effort in terms of experts' opinions, use cases, functional points and software size unit information. In addition, it evaluates neuro fuzzy cognitive connection based functional points to estimate effort effectively. The method examines the connectivity between one requirement and another and constructs a relationship graph that successfully eliminates incomplete requirements and details; this connectivity-based functional point analysis reduces the time needed to examine software effort. The performance of the system is then evaluated with the support of experimental results, in terms of measures such as MMRE and VAF.
    Keywords: Software Effort Estimation; size and judgmental software effort estimation process; enterprise in software; neuro fuzzy cognitive connection based functional points; MMRE and VAF.
    DOI: 10.1504/IJENM.2020.10021544
     
  • Study and prioritizing factors of Productivity of the Employees of Steel Manufacturing Industry, Kanjikode by extended ACHIEVE Model   Order a copy of this article
    by Vinu V G., Oliver Bright A. 
    Abstract: Employee productivity is a key factor in the success of manufacturing companies. Employees are an asset that cannot be imitated by other resources and, unfortunately, they are also the hardest to control. Performance improvement initiatives with a wide range of approaches, such as incentives or organisational climate improvement, are used in attempts to improve employee productivity. However, most studies take only one or two factors into consideration, which in many cases may not provide a comprehensive solution to the productivity problem. To overcome this, an extended ACHIEVE model named the MACHIEVE model has been proposed, with the additional factor M - Material. An analysis of the components of labour productivity based on this new MACHIEVE model was performed among employees of the steel manufacturing industry in Kanjikode. This is a structural equation modelling analysis in which a sample of 420 working personnel in different workgroups was selected from 1,280 employees through stratified random sampling. The survey tool included the labour productivity questionnaire of the MACHIEVE model. The data were analysed with AMOS-23 software, and the mean scores for the MACHIEVE model variables were calculated. The results indicated that the eight factors of the MACHIEVE model have an impact on increasing employee productivity, and the analysis suggested that two factors, C-Clarity and H-Help, had the greatest impact on labour productivity from the viewpoint of the staff.
    Keywords: ACHIEVE; MACHIEVE; Productivity; Employees; Steel Manufacturing.
    DOI: 10.1504/IJENM.2020.10021930
     
  • CAMELS MODEL ANALYSIS FOR DISTRICT CENTRAL CO-OPERATIVE BANKING ENTERPRISES IN ANDHRA PRADESH   Order a copy of this article
    by SATHYA V., Oliver Bright A. 
    Abstract: CAMELS is a recognised worldwide rating framework for evaluating the relative financial strength of banking enterprises and for proposing essential procedures to remedy their shortcomings. In Andhra Pradesh, before the separation of Telangana State, there were 22 DCCBs (district central co-operative banks) under the Andhra Pradesh State Cooperative Bank. To analyse the comparative performance of the DCCBs in Andhra Pradesh, the CAMELS model has been applied to the compound annual growth rate (CAGR) over 12 years (2002-2003 to 2013-2014), followed by a thorough rank test and statistical measures. CAMELS stands for capital adequacy, asset quality, management efficiency, earnings capacity, liquidity and sensitivity. CAMELS ratios are of vital importance in highlighting the sound financial position and overall health of the DCCBs through micro-level analysis of balance sheet and income statement items.
    Keywords: CAMELS; Andhra Pradesh; Cooperative Bank; Micro Analysis.
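
    The CAGR on which the analysis rests is the geometric mean growth rate: CAGR = (end / begin)^(1/years) - 1. A minimal sketch with illustrative figures:

      def cagr(begin, end, years):
          """Compound annual growth rate between two values."""
          return (end / begin) ** (1.0 / years) - 1.0

      # Illustrative: a ratio growing from 120 to 310 over the 12-year window.
      print(f"CAGR = {cagr(120.0, 310.0, 12):.2%}")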

Special Issue on: Sustainable Computing for Enterprise Resource Planning Applications

  • MINING MASSIVE ONLINE LOCATION BASED SERVICES FROM USER ACTIVITY USING BEST FIRST GRADIENT BOOSTED DISTRIBUTED DECISION TREE   Order a copy of this article
    by Venkatesh M., Mohanraj V, Suresh Y 
    Abstract: User activity is predicted through the frequency with which online content in location-based social networks (LBSNs) is produced and consumed by users. Researchers classify users into a number of groups depending on their level of activity, and services are created based on their location so that the maximum number of users benefit. This work involves gradient boosted distributed decision trees (GBDT), optimised over the total number of iterations and the shrinkage using a best-first strategy, with the data processed on a Hadoop network; Hadoop provides its own file system, called the Hadoop distributed file system. A Foursquare data set is created using the categories work, food, travel, park and shop. Stochastic gradient boosted decision trees are among the most commonly used machine learning algorithms at present: they can be interpreted easily and offer increased precision, but most implementations are computationally costly and need all the training data in main memory. In best-first search (BFS), the node with the lowest lower bound is expanded. The classifier used is the k-nearest neighbour (KNN) algorithm.
    Keywords: User activity; foursquare dataset; Stochastic Gradient Boosted Decision Trees (GBDT); Best-First Search (BFS) and K-Nearest Neighbor (KNN) classifier.

  • GRO AND WeGO - ALGORITHMIC APPROACHES TO INTEGRATE THE HETEROGENEOUS DATABASES AND ENHANCE THE EVALUATION OF ONTOLOGY MAPPING SYSTEMS IN THE SEMANTIC WEB   Order a copy of this article
    by Rajeswari Velliappan, Kavitha M, Dharmistan K.Varughese 
    Abstract: In the present-day world, where an information-driven economy and information-enhanced living standards rule everything, the sources from which information is derived are highly heterogeneous. This heterogeneity necessitates a mechanism for integrating data existing in a variety of forms before it is presented to the user in a fruitful manner. Different strategies, with a number of implementations, have been developed to help the world's population benefit from the ocean of data available across sources through a massive network of computers. The Internet and World Wide Web, forming the backbone of the information highway, will benefit from research solutions that enable people to retrieve data or information fitting their specific queries or requirements. The semantic web is an initiative towards the goal of information being machine-processed rather than requiring human intelligence. This work addresses the heterogeneity problem that exists among data sources and provides a solution through the application of ontology. An ontology is a structured data representation intended for information processing through machine intelligence, and a conceptual tool for handling semantic heterogeneity; artificial intelligence is a thrust area for ontology applications. The algorithmic approach adopted in the mapping solution system considers the most common structure of ontology representation, viz. the graph model. The graph nodes, or elements of the ontology, are compared carefully using a set of nine matching parameters to obtain various indices or scores, and a comprehensive similarity analysis is then carried out to arrive at the degree of matching of individual nodes, as well as of the ontology in totality, for an ontology alignment.
    Keywords: Ontology; Semantic Web; Heterogeneous databases; GRO and WeGO.

  • Feature Selection and Instance Selection using Cuttlefish Optimization Algorithm through Tabu Search   Order a copy of this article
    by Karunakaran Velswamy, Suganthi Muthusamy, Rajasekar Velswamy 
    Abstract: Over recent decades the amount of data generated has been growing exponentially, and existing machine learning algorithms are not feasible for processing such huge amounts of data. The two commonly adopted schemes for this are scaling up the data mining algorithms and data reduction; scaling up the algorithms is not the best way, whereas data reduction is quite feasible. In this paper, the cuttlefish optimization algorithm together with a tabu search approach is used for data reduction. A data set can be reduced in two ways: by selecting an optimal subset of features from the original data set, i.e. eliminating the features that contribute less information, or by selecting an optimal subset of instances, i.e. eliminating the instances that contribute less information. The cuttlefish optimization algorithm with tabu search finds both an optimal subset of features and an optimal subset of instances. The feature and instance subsets obtained provide a similar detection rate, a higher accuracy rate, a lower false positive rate and less computational time for training the classifier than obtained with the original data set.
    Keywords: Data Reduction; Instance Selection; Feature Selection; Cuttlefish Optimization; Tabu Search; Machine learning; Artificial Intelligence.

  • Enterprise Bigdata Analysis using SVM Classifier and Lexicon Dictionary   Order a copy of this article
    by Radha Subramani 
    Abstract: The emergence of the digital era has led to growth in the various types of data held in a cloud; in fact, perhaps three quarters of the total data can be treated as big data. In many organisations, massive volumes of both structured and unstructured data sit idle, and the various categories of data are complex to pre-process, analyse, store and visualise. Cloud computing provides a suitable platform for big data analytics, for storage and for predicting customer behaviour in order to sell products. Unstructured data such as emails, notes, messages, documents, notifications and Twitter comments (including from IoT devices) remain untapped and are not stored in a relational database, yet valuable information on pricing, customer behaviour and competitors may be buried within them. This makes cloud-based analytics an active research field in which several issues must be addressed and risks reduced. We therefore propose a method to extract and cluster sentiment information from various types of unstructured text data from social networks, using SVM classifiers combined with lexicons and machine learning, for sentiment analysis of customer behaviour feedback. The method performs efficient data collection and loading, and efficiently carries out sentiment analysis on the deep and hidden web.
    Keywords: Deep Web Mining; Sentiment Analysis; Big Data; Unstructured Data; Map Reduce; Hidden Information; Big Data Analytics; Text Mining; Clusters.
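
    A minimal sketch of the lexicon-plus-SVM combination the abstract describes, assuming scikit-learn: bag-of-words features are augmented with a lexicon polarity score before a linear SVM is trained. The tiny lexicon, texts and labels are invented for illustration:

      from scipy.sparse import csr_matrix, hstack
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.svm import LinearSVC

      LEXICON = {"good": 1, "great": 1, "love": 1, "bad": -1, "poor": -1, "hate": -1}

      def lexicon_score(text):
          return float(sum(LEXICON.get(w, 0) for w in text.lower().split()))

      texts = ["great product, love it", "poor quality, bad support",
               "good value", "i hate this, bad buy"]
      labels = [1, 0, 1, 0]  # 1 = positive feedback

      vec = CountVectorizer()
      X = hstack([vec.fit_transform(texts),
                  csr_matrix([[lexicon_score(t)] for t in texts])])

      clf = LinearSVC().fit(X, labels)

      new = "bad product"
      x = hstack([vec.transform([new]), csr_matrix([[lexicon_score(new)]])])
      print("positive" if clf.predict(x)[0] == 1 else "negative")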

  • AN OPTIMIZED NEURAL NETWORK BASED SPECTRUM PREDICTION SCHEME FOR COGNITIVE RADIO   Order a copy of this article
    by BHUVANESWARI BALUSAMY, Meeradevi T 
    Abstract: Cognitive radio (CR) technology enables unlicensed users to make use of the licensed spectrum wherever there is no interference. Spectrum sensing, by which non-authorised users perceive opportunities to obtain a channel, is therefore a very important tool; however, sensing consumes a large amount of spectrum energy, and predicting which channels are idle can not only reduce energy consumption but also further increase spectrum efficiency. Since the traffic characteristics of the various licensed user systems encountered in real life are not known a priori, the spectrum predictor is designed using a back propagation (BP) neural network (NN) model and a multi-layer perceptron (MLP), which need no knowledge of those characteristics. BP is an algorithm that requires no prior knowledge of real-world problems but can become trapped in local minima; it is widely used to solve such problems, and the literature suggests an evolutionary algorithm such as the bacterial foraging optimisation algorithm (BFOA) for the MLP NN, to enhance the learning process and improve the convergence rate and classification accuracy. The performance of the spectrum predictor is analysed using extensive simulations.
    Keywords: Spectrum Prediction; Cognitive Radio (CR); Neural Networks (NN); Multi-Layer Perceptron (MLP); Back Propagation (BP); Bacterial Foraging Optimization Algorithm (BFOA).

  • AN IMPROVED DOWNLINK PACKET SCHEDULING ALGORITHM FOR DELAY SENSITIVE DEVICES IN BOTH H2H AND M2M COMMUNICATIONS IN LTE-ADVANCED NETWORKS   Order a copy of this article
    by Radhakrishnan S., Neduncheliyan S, Thyagharajan Kk 
    Abstract: The demand for increased data rates with improved QoS for real-time data traffic is ever increasing in the present-day wireless environment. To meet this demand, the 3GPP has proposed LTE-Advanced, the 4.5G technology, which is set to deliver immense benefits to the wireless network world. Very few scheduling schemes in the literature consider the gain of weighted transmission rates by the channel for transmitting data, and even those schemes incur considerable scheduling overhead at the eNodeB (base station). Therefore, this work recommends an energy-efficient, QoS-aware scheduler with reduced scheduling complexity at the eNodeB for the transmission of delay-sensitive data in downlink LTE-Advanced networks. A basic LTE-Advanced scheduler has been designed by incorporating carrier aggregation with multiple input, multiple output (MIMO) features to provide the required bandwidth. The scheduling problem is formulated as a gain of weighted transmission rates over all possible combinations of the various resources required by the channel for transmitting data. An improved greedy algorithm with reduced scheduling complexity at the eNodeB has been developed to allocate resources dynamically to the user equipments (UEs), which can be either H2H or M2M devices, for the transmission of real-time data. The video frames given as input to this algorithm are compressed using the discrete wavelet transform to facilitate faster transmission. The results of this research show that the proposed scheduling algorithm is energy efficient and QoS-aware, and greatly improves the coverage of cell-edge users. The performance of this greedy-based scheduler is compared with two other notable schedulers in the literature, namely the LOG rule and the EXP rule, and it outperforms them in terms of throughput, fairness and spectral efficiency for real-time data transmission by the eNodeB in LTE-Advanced networks.
    Keywords: LTE-Advanced; Greedy Algorithm; Downlink Packet Scheduling; CA; MIMO; M2M; QoS.

Special Issue on: Cloud Computing in Enterprise Network Management

  • An Attempt to Enhance the Time of Reply for Web Service Composition with QoS   Order a copy of this article
    by Karthikeyan Sivasamy, Meenakshi Devi P 
    Abstract: Web services are the commonly prevailing service clusters of service-oriented architecture (SOA) and service-related assessments. The disputes relate to quality of service (QoS) in choosing web services freely and creating a collection of web services for carrying out transactions. The ultimate aim is to choose web services based on non-functional features and quality of service (QoS) ranks, considering that the web services hold identical functional features for every process while differing in non-functional features and QoS metrics. To choose a web service for every process, a social aspect web (SAW) scheme is employed which does not exhaustively enumerate all sorts of web services: it uses the requirements of the user to rank the set of web service candidates for the applications, and finally applies the SAW scheme over that set. The mechanism helps in selecting web services in terms of quality of service (QoS) scores and user needs, and this choice among varied web services can be utilised for aggregating and organising web services, resulting in an optimised response time for the web service actions.
    Keywords: QoS; SAW; Ranks; Web Services; Non-Functional Features.
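
    In the QoS-based service selection literature, SAW usually denotes simple additive weighting: each candidate's QoS attributes are normalised to [0, 1] (inverting "smaller is better" attributes) and combined as a weighted sum reflecting the user's requirements. Assuming that reading, a minimal sketch with illustrative services and weights:

      services = {
          "s1": {"response_ms": 120.0, "availability": 0.990, "cost": 5.0},
          "s2": {"response_ms": 80.0,  "availability": 0.950, "cost": 9.0},
          "s3": {"response_ms": 200.0, "availability": 0.999, "cost": 2.0},
      }
      weights = {"response_ms": 0.5, "availability": 0.3, "cost": 0.2}
      benefit = {"response_ms": False, "availability": True, "cost": False}

      def normalise(attr, value):
          vals = [s[attr] for s in services.values()]
          lo, hi = min(vals), max(vals)
          x = (value - lo) / (hi - lo) if hi > lo else 1.0
          return x if benefit[attr] else 1.0 - x  # invert cost-type attributes

      scores = {name: sum(weights[a] * normalise(a, qos[a]) for a in weights)
                for name, qos in services.items()}
      print(max(scores, key=scores.get), scores)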

  • Attempting to Design Differed Service Broker Forwarding Strategy for Data Centres in Cloud Environment   Order a copy of this article
    by S. Prabu, S. Karthik 
    Abstract: Cloud computing is based on distributed computing resources that control diverse services such as servers, storage and applications, which are offered to users on a pay-per-usage basis from data centres. The data centres are positioned globally; moreover, they can become overloaded as an escalating number of client applications are serviced at the same time and place, which degrades the overall quality of service of the relayed services. Diverse user applications may need diverse customisation, and calibrating the performance of user applications on different resources is quite intricate, leaving the service supplier unable to choose the suitable set of resources. The differed service broker forwarding strategies designed here are based on heuristics intended to accomplish minimal response time, accounting for the transmission medium, bandwidth, latencies and task size. The designed service broker strategy attempts to minimise the overloading of data centres by conveying user demand to subsequent data centres that achieve improved response and operational times. The analysis reveals promising outcomes in terms of response and operational time compared with other existing broker strategies.
    Keywords: Service Broker; Cloud Environment; Quality of Service; Data Centres; Latency.
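
    A minimal sketch of the broker decision the abstract describes: estimate each data centre's reply time from network latency, bandwidth, task size and current load, then forward the request to the cheapest. The estimating formula is an illustrative stand-in for the designed heuristics:

      # Candidate data centres: latency (ms), bandwidth (MB/s), queued tasks.
      datacentres = {
          "dc-eu": {"latency_ms": 40.0,  "bandwidth": 50.0, "queued": 12},
          "dc-us": {"latency_ms": 120.0, "bandwidth": 80.0, "queued": 3},
          "dc-ap": {"latency_ms": 90.0,  "bandwidth": 60.0, "queued": 25},
      }

      TASK_MB = 8.0
      SERVICE_MS = 15.0  # illustrative mean service time per queued task

      def estimated_reply_ms(dc):
          transfer_ms = TASK_MB / dc["bandwidth"] * 1000.0
          return dc["latency_ms"] + transfer_ms + dc["queued"] * SERVICE_MS

      target = min(datacentres, key=lambda name: estimated_reply_ms(datacentres[name]))
      print(f"forward request to {target}")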