Forthcoming articles


International Journal of Information and Communication Technology


These articles have been peer-reviewed and accepted for publication in IJICT, but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.


Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.










International Journal of Information and Communication Technology (82 papers in press)


Regular Issues


  • Research on eye localization method based on adaptive correlation filter   Order a copy of this article
    by Jin Wu 
    Abstract: Eye localisation is one of the most important steps in face recognition and visual tracking systems. This article combines the correlation filter with integral projection to detect eye position precisely, and puts forward two improvements in the training and test phases. Firstly, the adaptive synthetic correlation filter is rotated within the angle range of -0.2, -0.1, 0, 0.1, 0.2, and the location of the maximum grey value serves as an initial anchor point. Secondly, within the anchor point 5
    Keywords: Eye Localization; Correlation Filter; Grey Integral Projection; Adaptive.
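
The grey integral projection step the abstract relies on can be illustrated with a minimal sketch (plain Python, hypothetical helper names; the paper's adaptive correlation filter itself is not reproduced here). Eye regions are darker than the surrounding skin, so row and column grey-level sums dip at the eyes:

```python
def integral_projections(img):
    """Row and column grey-level sums of a 2-D image (list of lists)."""
    rows = [sum(r) for r in img]
    cols = [sum(c) for c in zip(*img)]
    return rows, cols

def darkest_row(img):
    """Index of the row with the smallest grey-level sum (eye candidate)."""
    rows, _ = integral_projections(img)
    return min(range(len(rows)), key=rows.__getitem__)
```

On a toy 4x4 image with one dark band, `darkest_row` picks out that band; a real pipeline would restrict the projection to a face region first.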

  • A new approach to Supporting Runtime Decision Making in mobile OLAP   Order a copy of this article
    by Djenni.Rezoug Nachida 
    Abstract: A mobile OLAP (On-Line Analytical Processing) system offers decision makers real-time, relevant analyses anywhere and at any time. To generate them, a mobile OLAP system should not only use user preferences, but also exploit information about the contextual situation (meeting, business travel, office work or home work) in which the analyses are done. For instance, when generating analyses, a mobile OLAP system could take into account whether the decision maker's contextual situation is business travel (a device with limited resources) or office work (a device with high capacities). To this end, in this paper we propose a mobile context-aware recommender system (MCARS for short) based on both user preferences and context. Unfortunately, the limited resources available to the MCARS make reducing context acquisition a necessity. To achieve this goal, our system (i) uses a learned approach to generate relevant contextual factors (contextual factors shown to be important); (ii) deduces a relationship between a context and the user's preferences (called contextual preferences); and finally (iii) recommends a set of analyses based on the user's contextual preferences.
    Keywords: relevant Context; Context-aware recommender system; knowledge-based recommender system; K2.

  • Fruit fly image segmentation and species determination algorithm   Order a copy of this article
    by Pei XU 
    Abstract: For large-scale, real-time and accurate monitoring of citrus orchard fruit flies, an algorithm was developed to identify three kinds of mature fruit flies, Bactrocera dorsalis, B. cucurbitae and B. tau (Diptera: Tephritidae), based on machine vision technology. A fruit fly sample image library was produced and the body characteristics distinguishing the three kinds of flies were analysed. Within the identification algorithm, image target and background region segmentation was conducted in the YCbCr colour space with long-axis searching of the scutellum area. Image registration was achieved using the Hough transform, and a BP neural network identification model was established to distinguish the fruit fly in each image. Every algorithm in this study was programmed in Matlab. The image registration algorithm was applied to 120 images, each containing a single specimen of B. dorsalis, B. cucurbitae or B. tau from the self-captured sample library, and the BP neural network model was applied to identify the fruit flies. Experimental results indicated that: (1) the yellow scutellum at the waist and abdomen of the three kinds of fruit fly contained the largest yellow area within the body, so it could be used as the long-axis searching area during image registration; the vertical yellow lines in the middle of the back torso of both B. cucurbitae and B. tau distinguish these two species from B. dorsalis, and the ratio of the area of these lines to the whole body could be used to distinguish among the three species; (2) the image registration accuracy was 100%, with an average registration time of 0.4 seconds; (3) the BP neural network model accuracy was 100%, indicating satisfactory identification. Further research will focus on improving the self-adaptive aspect of threshold selection, since the threshold selection for target region segmentation was influenced by the experimental environment.
    Keywords: machine vision; fruit fly; citrus orchard; Hough transform; precision agriculture.

  • Boosting Prediction Performance on Imbalanced Dataset   Order a copy of this article
    by Masoumeh Zareapoor, Pourya Shamsolmoali 
    Abstract: Mining from imbalanced data is an important problem in algorithm design and performance evaluation. When a dataset is imbalanced, a classification technique does not consider both classes equally. Standard classifiers are not suitable for imbalanced data, since they tend to classify all instances into the majority class, which is the less important class. Additionally, some performance measures, like accuracy, which is known to be a biased metric in the case of imbalanced data, perform poorly when the data is imbalanced. In this paper we apply various techniques commonly used to handle class imbalance before giving the data to the classifiers, but the performance of the classifiers is found to degrade because of the highly imbalanced nature of the datasets. Hence, we propose an integrated sampling technique with an AdaBoost ensemble to improve the prediction performance. Meanwhile, through empirical study, we show the more appropriate performance measures for mining imbalanced datasets.
    Keywords: Imbalanced dataset; classification; re-sampling; ensemble.
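
One of the commonly used rebalancing techniques the abstract alludes to, random oversampling of the minority class, can be sketched as follows (plain Python, hypothetical function names; the paper's integrated sampling-plus-AdaBoost ensemble is not reproduced here):

```python
import random

def oversample(samples, labels, seed=0):
    """Randomly duplicate minority-class samples until all classes
    reach the size of the largest class."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    target = max(len(v) for v in by_class.values())
    xs, ys = [], []
    for y, v in by_class.items():
        extra = [rng.choice(v) for _ in range(target - len(v))]
        for x in v + extra:
            xs.append(x)
            ys.append(y)
    return xs, ys
```

After rebalancing, both classes contribute equally to training; the abstract's point is that such resampling alone is often insufficient, which motivates combining it with boosting.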

  • Convergence Analysis of Adaptive MSFs used for Acoustic Echo Cancellation   Order a copy of this article
    by Alaka Barik, Mihir N. Mohanty, Kunal Das 
    Abstract: As technology advances day by day, new challenges arise accordingly. In the modern era of communication, echo cancellation is a major problem, and cancelling the acoustic echo caused by loudspeaker-microphone coupling is an essential as well as challenging task. Adaptive filters are mostly used to cancel the echo in the present scenario, and many researchers are still working in this area. The LMS algorithm is used extensively in communication networks to correct the echoes created by line impedance mismatches and to compensate for imperfections in telephony networks. This paper shows how the LMS algorithm applied through multiple sub-filters (MSFs) can solve echo problems. A large number of coefficients is required for acoustic echo cancellation over a long path, but a lengthy filter results in slow convergence. This paper addresses this tradeoff using the LMS algorithm for echo cancellation. The proposed algorithm is based on decomposing a long adaptive filter into smaller sub-filters. Different types of errors have been analysed, that is, mean error, common error and a combination of both. A comparative analysis between the mean error and common error shows their performance in terms of convergence. Simulation results show that the decomposed algorithm performs better than the long adaptive filter.
    Keywords: Echo Cancellation; Adaptive Algorithm; Convergence; Mean Error; Common Error; Composite Error.

  • Probability Least Squares Support Vector Machine with L1 Norm for Remote Sensing Image Retrieval   Order a copy of this article
    by Jinhua Zhu 
    Abstract: This paper proposes a probability least squares support vector machine (PLSSVM) classification method aimed at the characteristics of remote sensing image data, such as high dimensionality, nonlinearity and massive numbers of unlabelled samples. A hybrid entropy was designed by combining quasi-entropy with entropy difference, and was used to select the most valuable samples to be labelled from a larger set of samples. An L1 norm distance measure was then used to further screen out outliers and redundant data. Finally, based on the originally labelled samples and the screened samples, the PLSSVM was obtained through training. Experimental results on the classification of ROSIS hyperspectral remote sensing images show that the overall accuracy and Kappa coefficient of the proposed classification method are higher than those of existing methods. The proposed method obtains higher classification accuracy with fewer training samples, which makes it very applicable to current classification problems.
    Keywords: remote sensing image; hybrid entropy; L1 norm; active learning; PLSSVM (probability least squares support vector machine).
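
The L1 norm screening idea can be illustrated with a toy sketch (plain Python, hypothetical function names; the hybrid-entropy sample selection and the PLSSVM itself are beyond a short example):

```python
import statistics

def l1_distance(a, b):
    """L1 (Manhattan) distance between two feature vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def screen_outliers(samples, threshold):
    """Keep samples whose L1 distance to the component-wise median is
    within a threshold -- a rough stand-in for outlier removal."""
    centre = [statistics.median(col) for col in zip(*samples)]
    return [s for s in samples if l1_distance(s, centre) <= threshold]
```

Samples far from the bulk of the data, in the L1 sense, are dropped before training.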

  • Application of Soft Computing Neural Network Tools to Line Congestion Study of Electrical Power Systems   Order a copy of this article
    by Prasanta Kumar Satpathy 
    Abstract: This paper presents a scheme for applying soft computing neural network tools, namely a Feed Forward neural network with backpropagation and a Radial Basis Function neural network, to the study of transmission line congestion in electrical power systems. The authors performed sequential training of the two proposed neural networks for monitoring the level of line congestion in the system. Finally, a comparative analysis is drawn between the two neural networks, and it is observed that the Radial Basis Function neural network yields the faster convergence. The proposed method is tested on the IEEE 30-bus test system subject to various operating conditions.
    Keywords: line congestion index; neural network; hidden layer; training performance.

  • Development and implementation of a wireless sensor system for landslide monitoring application in Vietnam   Order a copy of this article
    by Dinh-Chinh Nguyen, Duc-Tan Tran 
    Abstract: The effects of climate change and human activities lead to a series of dangerous phenomena, such as landslides and floods. It is therefore necessary to build systems to monitor environmental hazards. Several studies have built landslide monitoring systems based on wireless sensor networks (WSNs); however, no single WSN design is the accepted standard for landslide monitoring. Energy saving, which helps extend the lifetime of the landslide monitoring system, is very important. This paper focuses on the development of a WSN system with a proposed energy-efficient scheme. In this work, we build a complete system that consists of sensor nodes, a gateway node, a database, a website interface and an Android application for landslide monitoring. The energy efficiency scheme is applied at the sensor nodes to increase their lifetime by up to 154 times and significantly increase the rate of successfully received packets. The system was also assembled and measured in the lab and outdoors to analyse initial results.
    Keywords: landslide; energy; wireless sensor network; WSN; power consumption; Vietnam.
    DOI: 10.1504/IJICT.2018.10010632
  • A 24 GHz Dual FMCW Radar to Improve Target Detection for Automotive Radar Applications   Order a copy of this article
    by Quang Nguyen, MyoungYeol Park, YoungSu Kim, Franklin Bien 
    Abstract: A 24 GHz automotive radar with a Dual Frequency Modulated Continuous Waveform (FMCW) is proposed. By using this modulation waveform, ghost targets can be avoided, especially in multi-target situations; thus, the detection ability of radar systems can be significantly improved. In this paper, in order to generate the Dual FMCW signal, a Dual FMCW Modulation Control Logic (DFMCL) is proposed. This block incorporates a 24 GHz fractional-N phase locked loop (PLL) to synthesise the 24 GHz modulation waveform. The proposed architecture was designed in a 130 nm CMOS process. Two alternating chirps of 12 ms and 6 ms were generated in the coherent processing interval, with a modulation bandwidth of 200 MHz. Moreover, a radar transceiver, which consists of the 24 GHz Dual FMCW generator and monolithic microwave integrated circuits (MMICs), was implemented. A behavioural simulation was conducted to evaluate the operation of the proposed generator, and the transceiver was then modelled to test the detection ability of the automotive radar. The results demonstrate that the proposed scheme can realise the Dual FMCW waveform for automotive radar systems and can avoid ghost targets in multi-target scenarios.
    Keywords: automotive radar; 24 GHz; FMCW; transceiver; multi-target.

  • Converged Services Composition with Case-Based Reasoning   Order a copy of this article
    by Hui Na Chua, S.M.F.D Syed Mustapha 
    Abstract: To achieve converged service composition in a Next Generation Network environment, it is necessary to have an approach capable of managing the potential complexities due to service unavailability and network failures. In response to these challenges, we propose a converged service composition (CSC) framework with a management function that uses case-based reasoning (CBR) to handle service unavailability and/or network failures during the service composition process.
    Keywords: web service composition; case-based reasoning; next generation network service layer

  • Optimised Cost Considering Huffman Code for Biological Data Compression   Order a copy of this article
    by Youcef GHERAIBIA, Sohag Kabir, Abdelouahab Moussaoui, Smaine Mazouzi 
    Abstract: The classical Huffman code has been widely used to compress biological datasets. Though a considerable reduction in data size can be obtained by the classical Huffman code, a more efficient encoding is possible by treating binary bits differently according to requirements such as transmission time and energy consumption. A number of techniques have already modified the Huffman code algorithm to obtain optimal prefix codes for unequal letter costs in order to reduce the overall transmission cost (time). In this paper, we propose a new approach to improve the compression performance of one such extension, the cost considering approach (CCA), by applying a genetic algorithm for optimal allocation of the codewords to the symbols. The idea of the proposed approach is to sacrifice some cost to minimise the total number of bits; hence, the genetic algorithm works by placing a penalty on the cost. The performance of the approach is evaluated by using it to compress some standard biological datasets. The experiments show that the proposed approach improves the compression performance of the CCA considerably without increasing the cost significantly.
    Keywords: Data Compression; Huffman Code; Information Coding; Genetic Algorithm; Cost Considering Approach; Optimization.
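
The classical Huffman baseline that the CCA extends can be sketched with the standard heap construction (plain Python; the unequal-letter-cost and genetic-algorithm refinements of the paper are not shown):

```python
import heapq

def huffman_codes(freqs):
    """Classical Huffman codes for a {symbol: frequency} map.
    Repeatedly merges the two lightest subtrees; the integer tick
    breaks frequency ties so dicts are never compared."""
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tick, merged))
        tick += 1
    return heap[0][2]
```

For DNA-like symbol frequencies, the most frequent symbol receives the shortest codeword and the resulting code is prefix-free.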

  • Sybil Attack Resistant Location Privacy in VANET   Order a copy of this article
    by Balaram Allam, Pushpa S 
    Abstract: Vehicular Ad Hoc Networks (VANETs) are susceptible to a large number of attacks due to their open medium and anomalous nature. Location privacy is an imperative challenge in VANETs: an attacker can easily trace a vehicle's activities if it has knowledge of the location. In most location privacy preserving mechanisms, each RSU provides secret information to a vehicle entering its range, and this information is known only to the corresponding RSU. However, these mechanisms are vulnerable to a Sybil attack, whereby a malicious vehicle compromises an RSU and pretends to be multiple vehicles. Thus, an effective mechanism is needed to identify the attacker and the compromised RSU in a VANET. This paper proposes a Sybil Attack Resistant Location Privacy (SARLP) system to identify the attacker even when a compromised RSU is present in the VANET. The SARLP system employs a Location Privacy Unit (LPU) to provide an effective authentication mechanism. It hides the real identity of a vehicle by providing a temporary key and a trusted certificate to the user, thus improving location privacy. The RSU imposes its signature on a secret random number generated by the vehicle entering its region, and each vehicle and RSU maintains the random number secretly. When two vehicles communicate, the sender reveals the sequence of secret random numbers received from different RSUs along its travelling path. The secret maintenance mechanism verifies the secret random number at the interference range of two RSUs, so a genuine RSU can successfully detect the compromised RSU. The simulation results show that the SARLP system achieves a higher level of location privacy preservation of vehicles and attack resilience in networks compared to the existing footprint scheme.
    Keywords: VANET; Location privacy; Compromised RSU; Sybil attack; Authentication.

  • A trust management based on a cooperative scheme in VANET   Order a copy of this article
    by Ahmed Zouinkhi, Amel Ltifi, Chiraz Chouaib, Mohamed Naceur Abdelkrim 
    Abstract: A VANET is a vehicular ad hoc network, which is highly dynamic, self-organised and without a preexisting infrastructure. A VANET works properly only if the participating vehicles cooperate to ensure the exchange of packets. This special network confronts many constraints, such as attacks by malicious entities and the absence of trust between nodes. To solve these problems, we propose an approach with a decentralised architecture combining two models: a cluster model and a trust management model. Our approach encourages cooperation between vehicles in broadcasting packets using the reward concept, and ensures the detection of selfish nodes using the trust concept. By applying a punishment mechanism, our approach aims to prevent malicious nodes from disrupting the network by injecting false information. Besides, in our network, we guarantee authentic packet forwarding, which is controlled by the group leader acting as a watchdog. Our approach is based on asymmetric cryptography, using RSA encryption and digital signatures to ensure security. The simulation results provided by the NS3 simulator show that our approach has better performance.
    Keywords: VANET; Cooperation; Trust management; Security; NS3.

  • Visible Light Communication based High-Speed High-Performance Multimedia Transmission   Order a copy of this article
    by Atul Sewaiwar, Samrat Vikramaditya Tiwari, Yeon-Ho Chung 
    Abstract: A novel scheme for high-speed, high-performance multimedia transmission using visible light communication is presented. Initially, the multimedia image is converted to digital data. This digital data is divided into three sub-streams, and each sub-stream is then transmitted via three parallel channels. Prior to transmission, each sub-stream is modulated using on-off keying (OOK). Three parallel channels corresponding to each colour (red, green and blue) of an RGB LED are utilised for transmission, thus giving a total of nine channels. Colour filters (CFs) and selection combining (SC) are also utilised for performance improvement and high-speed transmission. Simulations are performed to evaluate the effectiveness of the proposed scheme. Results show that the proposed scheme is efficient in terms of bit-error-rate (BER) performance and data rate; thus, it can effectively be used for high-speed multimedia transmission.
    Keywords: free space optical communications; visible light communication; multimedia; multiplexer; demultiplexer
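
The three-way sub-stream split can be sketched with a round-robin demultiplexer (plain Python, hypothetical function names; OOK modulation and the optical RGB channels themselves are abstracted away):

```python
def demux3(bits):
    """Split one bit sequence round-robin into three parallel sub-streams."""
    return [bits[i::3] for i in range(3)]

def mux3(streams):
    """Recombine three equal-length sub-streams back into one sequence."""
    out = []
    for triple in zip(*streams):
        out.extend(triple)
    return out
```

Demultiplexing then remultiplexing recovers the original bit order, which is the property the receiver relies on.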

  • Analysis of Energy Aware Job Offloading in Mobile Cloud   Order a copy of this article
    by Junyoung Heo, Hong Min, Jinman Jung 
    Abstract: Mobile cloud computing is the combination of cloud computing and mobile computing, and provides rich computational resources to mobile devices. In mobile cloud computing, computation offloading techniques are used to overcome the limitations of resource-constrained mobile devices: some parts of a job are performed in the cloud on behalf of the mobile device. If the cost of executing a part of the job on the mobile device is larger than the cost associated with offloading it, that part is executed in the cloud. Traditional cost analysis models for deciding which parts of a job to execute in the cloud or on the mobile device estimated only the cost of offloading, which is composed of the data transfer and the response time required for the function call. In this paper, we propose a novel offloading cost analysis model based on the data synchronisation rate and the data exchange rate for the input of the function, to improve the accuracy of offloading cost estimation. We confirm through experiments that the offloading technique with the proposed model can reduce the execution time of a job and consequently improve energy efficiency compared to previous techniques.
    Keywords: Mobile Cloud Computing; Profiling; offloading; Remote Procedure Call.
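
The basic offloading decision the abstract describes can be sketched as a cost comparison (plain Python; all names and the `sync_rate` parameter are illustrative, loosely inspired by the paper's synchronisation-rate refinement):

```python
def should_offload(local_cost, transfer_bytes, bandwidth, remote_cost,
                   sync_rate=1.0):
    """Offload a job part when running it locally costs more than shipping
    the data plus running it remotely. sync_rate scales the data that must
    actually be exchanged: not all input data needs synchronisation."""
    transfer_cost = (transfer_bytes * sync_rate) / bandwidth
    return local_cost > transfer_cost + remote_cost
```

With a small `sync_rate`, the transfer term shrinks, so more job parts become worth offloading, which is the effect a more accurate synchronisation model captures.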

  • Method of Trajectory Privacy Protection Based on Restraining Trajectory in LBS*   Order a copy of this article
    by Zwmao Zhao, Jiabo Wang, Chuanlin Sun, Youwei Yuan, Bin Li 
    Abstract: With the development of mobile positioning technology, location-based services are becoming more and more widely used in daily life, but they have produced the security problem of leakage of users' privacy. In this paper, the problem of user trajectory privacy protection in location-based services is introduced, and a method of trajectory privacy protection based on restraining the trajectory in LBS is proposed. The proposed method restrains the release of sensitive positions and chooses the non-sensitive position at which the user is most likely to stay to replace the sensitive position. This prevents the leakage of sensitive positions on the user's trajectory and protects the user's activity trajectory. A method of calculating the privacy protection degree of the restraining method is also given.
    Keywords: Location-based service(LBS); privacy preservation; trajectory privacy; restraining trajectory.
    DOI: 10.1504/IJICT.2018.10005451
  • An improved design of P4P based on distributed Tracker   Order a copy of this article
    by Lixin Li, Feng Wang, Wentao Yu, Xiuqing Mao, Mengmeng Yang, Zuohui Li 
    Abstract: The P4P architecture is mainly composed of the appTracker, iTracker and Peers. A single appTracker manages shared resources across the different ISP domains: every Peer registers with the appTracker when joining the network, and then requests resources from it. In this architecture, the workload on the single appTracker is too great; thus, a bottleneck often appears when the scale of the network grows in this centralised structure. An improved design of P4P based on a distributed Tracker is proposed to solve the overload problem of the single appTracker server. In the improved P4P system, a distributed Tracker overlay network replaces the appTracker to manage the resources in the different ISP domains. The functions of the iTracker arranged by the ISP are extended, and the info interfaces of the iTracker are designed in detail in order to share resources among the different iTrackers. Experiments prove that the P4P framework based on the distributed Tracker can solve the server bottleneck problem and improve scalability and stability while maintaining the characteristics of locality and transmission capacity.
    Keywords: Proactive network Provider Participation for P2P (P4P); distributed Tracker; overlay network; weighted graph.

  • Chinese-Naxi Syntactic Statistical Machine Translation Based on Tree-to-Tree   Order a copy of this article
    by Shengxiang Gao, Zhiwen Tang, Zhengtao Yu, Chao Liu, Lin Wu 
    Abstract: To use Naxi syntactic information efficiently, we put forward a method of Chinese-Naxi syntactic statistical machine translation based on a tree-to-tree model. Firstly, to use the syntactic information of both the source and target languages, the method collects a Chinese-Naxi aligned parallel corpus, performs syntactic parsing on both sides, and obtains the corresponding phrase structure trees of Chinese and Naxi. Then, using the GMKH algorithm to extract a large number of translation rules between Chinese treelets and Naxi treelets, and inferring their probabilistic relationships from these rules, it obtains the translation templates. Finally, using these translation templates to guide decoding through a tree-parsing algorithm, translating each Chinese phrase treelet bottom-up, it obtains the final translation. In comparison with the tree-to-string model, experiments show that this method improves the BLEU score by 1.2 points. This proves that both Chinese and Naxi syntactic information are very helpful in improving the performance of Chinese-Naxi machine translation.
    Keywords: machine translation; Chinese-Naxi; syntax; tree-to-tree

  • Research on wireless sensor network for mechanical vibration monitoring   Order a copy of this article
    by Liang Zong, Wencai Du, Yong Bai 
    Abstract: Mechanical vibration monitoring systems based on cable connections suffer from complex cabling layouts, high cost, and poor maintainability and system flexibility. This paper introduces the wireless sensor network (WSN) into mechanical vibration monitoring, where monitoring data are transmitted by radio. This brings significant advantages: low cost, remote monitoring, and easier diagnosis and maintenance. The multi-hop function and topological flexibility of a WSN can effectively avoid the wireless signal attenuation caused by buildings and equipment. This paper presents a wireless sensor network model for mechanical vibration monitoring, analyses two kinds of wireless sensor network topology, and sets up a vibration sensor monitoring network system. The system uses vehicle vibration sensors to collect monitoring data and, combining the ADTCP algorithm for multi-hop networks, puts forward a scheme that limits the congestion window to reduce network congestion. The scheme can effectively alleviate sink node congestion in the mechanical vibration monitoring wireless sensor network and improve the performance of network monitoring.
    Keywords: sensor network; mechanical vibration; vibration monitoring

  • Security and Robustness of a Modified ElGamal Encryption Scheme   Order a copy of this article
    by Karima Djebaili, Lamine Melkemi 
    Abstract: In this paper we propose a new and practical variant of ElGamal encryption which is secure against every passive and active adversary. Under the hardness of the decisional Diffie-Hellman assumption, we can prove that the proposed scheme is secure against adaptive chosen ciphertext attacks in the standard model. Such security ensures not only the confidentiality but also the integrity and authentication of communications. We show that the modified scheme furthermore achieves anonymity as well as strong robustness.
    Keywords: ElGamal encryption; adaptive chosen ciphertext attacks; decisional Diffie-Hellman assumption; robustness.
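
For reference, textbook ElGamal, the baseline the paper modifies (this is not the authors' chosen-ciphertext-secure variant), over a small prime group:

```python
import random

def elgamal_keygen(p, g, rng):
    """Private key x and public key h = g^x mod p."""
    x = rng.randrange(2, p - 1)
    return x, pow(g, x, p)

def elgamal_encrypt(p, g, h, m, rng):
    """Ciphertext (g^k, m * h^k mod p) for fresh randomness k."""
    k = rng.randrange(2, p - 1)
    return pow(g, k, p), (m * pow(h, k, p)) % p

def elgamal_decrypt(p, x, c1, c2):
    """Recover m = c2 / c1^x mod p, inverting via Fermat (p prime)."""
    s = pow(c1, x, p)
    return (c2 * pow(s, p - 2, p)) % p
```

Textbook ElGamal is only semantically secure under DDH and is malleable, which is precisely the gap adaptive chosen-ciphertext-secure variants such as the one proposed here aim to close. (Real deployments use large safe primes or elliptic-curve groups, not the toy parameters below.)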

  • Design and Realisation of a Wireless Data Acquisition System for Vibration Measurements   Order a copy of this article
    by Surgwon Sohn 
    Abstract: Nowadays, the sensing, processing and analysis of vibration signals are key components of structural health monitoring systems, and wireless data acquisition (DAQ) systems for vibration measurements are becoming more and more important in this field. This paper presents the hardware and software design of a wireless data acquisition system for this purpose. The DAQ system is based on the TMS320 digital signal processor, which enables real-time processing of vibration signals. Sensitivity is one of the key performance features of a DAQ system, and we use an integrated circuit piezoelectric (ICP) accelerometer as the detection sensor for best performance. For faster wireless transmission of large amounts of acceleration data, a new data link protocol over the Bluetooth interface between the wireless DAQ and a smartphone is proposed. An Android smartphone is a good choice of user interface for the mobile data acquisition system; in order to display vibration signals in real time on the Android smartphone, a commercial Java graphic library is used.
    Keywords: Wireless Data Acquisition System; Vibration Signal; Accelerometer; Data Link Protocol; Smartphone Interface.

  • Weighted Estimation for Texture Analysis based on Fisher Discriminative Analysis   Order a copy of this article
    by Xiaoping Jiang, Chuyu Guo, Hua Zhang, Chenghua Li 
    Abstract: Traditional texture analysis methods use only the relative contribution of each face area to measure global similarity. To solve the problem that features are extracted locally rather than globally, a Weighted Estimation for Texture Analysis (WETA) method based on Fisher Discriminative Analysis (FDA) is proposed. First, Local Binary Patterns (LBP) or Local Phase Quantization (LPQ) is used for image texture encoding. Then, the image is divided into equal-sized, non-overlapping local patches. The most discriminative axes, extracted from the similarity space by the FDA method, are applied to texture analysis, and the best solution is obtained through weight optimisation. Finally, experiments on two major general face databases (FERET and FEI) verify the effectiveness of the proposed method. The experimental results show that, compared with texture methods in other papers, the proposed method obtains better recognition performance.
    Keywords: Face recognition; Fisher discriminative analysis; Weighted estimation; Texture coding.
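
The LBP encoding step the method starts from can be sketched for a single pixel (plain Python, basic 8-neighbour variant; the FDA weighting and patch division are not shown):

```python
def lbp_code(img, r, c):
    """Basic 8-neighbour local binary pattern code for pixel (r, c):
    each neighbour at least as bright as the centre contributes one bit."""
    centre = img[r][c]
    neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                  (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(neighbours):
        if img[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code
```

Histograms of these per-pixel codes over each patch form the texture descriptor that the discriminative weighting then operates on.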

  • A New Lightweight RFID Mutual Authentication Protocol Based on CRC Code   Order a copy of this article
    by Xiaohong Zhang, Juan Lu 
    Abstract: A new lightweight RFID mutual authentication protocol, in which the security keys are updated dynamically, is presented using cyclic redundancy check (CRC) code operations and some simple logic operations. GNY logic proof and security analysis show that the protocol not only achieves the mutual authentication requirements between readers and tags effectively, but also prevents many security and privacy problems, such as eavesdropping attacks, replay attacks, replication attacks and tag tracking, without increasing the computation and communication traffic of the RFID system. In particular, the new protocol meets the EPC Class1 Gen2 standard; compared with other existing protocols of the same security level, the new protocol has low hardware implementation complexity, accounting for only 18.75% of the reserved space in terms of tag storage, and it is suitable for low-cost RFID systems. Hence, the new protocol can realise the combination of high security and low tag cost.
    Keywords: RFID; CRC code; mutual authentication protocol; GNY logic; security analysis.
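    As a rough illustration of the CRC operations such a protocol builds on, here is a standard bitwise CRC-16/CCITT-FALSE plus a toy challenge-response step; the paper's actual CRC variant, key handling and message flow are not specified here, so treat this purely as a sketch:

```python
def crc16_ccitt(data: bytes, poly=0x1021, init=0xFFFF) -> int:
    """Bitwise CRC-16/CCITT-FALSE, a lightweight check computable on tags."""
    crc = init
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ poly) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

# Standard check value for this CRC variant
print(hex(crc16_ccitt(b"123456789")))  # -> 0x29b1

# Toy challenge-response (hypothetical): the tag proves knowledge of a
# shared key by returning CRC(key XOR nonce); the real protocol differs.
key, nonce = 0x5A5A, 0x1234
response = crc16_ccitt((key ^ nonce).to_bytes(2, "big"))
```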

  • Application of PatchNet in Image representation   Order a copy of this article
    by Hao Cheng, Zhonglin He 
    Abstract: PatchNet, a graph model with a hierarchical structure, is a new technology for image representation. Its description structure for images conforms well to the cognitive features of the human visual system, and semantic information and geometric structural information can be stored and represented compactly. PatchNet can realise an abstract representation of an input image. This paper introduces the PatchNet representation of images, describes its detailed structure, and analyses how to apply PatchNet to content searching and editing based on an image library.
    Keywords: PatchNet; Image representation; Geometric structure.

  • Experimental study on fibre Bragg grating temperature sensor and its pressure sensitivity   Order a copy of this article
    by Deng-pan ZHANG, Jin WANG, Yong-jie WANG 
    Abstract: At present, ocean temperature measurement depends almost entirely on electrical signal inspection, but such sensors are not safe in water. To overcome this shortcoming and improve measurement accuracy and safety, the temperature sensing theory of the fibre Bragg grating was analysed and a new ocean temperature sensor was presented, with the advantages of all-optical elements, underwater safety and feasibility of sensor networking. By setting a spring in the metal tube, the temperature was measured with a pressure-isolation encapsulation. Repeated rising- and falling-temperature tests were carried out. Results show that under no pressure and temperatures of 0~35℃ the sensor has excellent repeatability, hysteresis and linearity. The temperature sensitivity is 29.9 pm/℃, which is close to the theoretical value. Pressure sensitivity tests were executed under 0~5 MPa; results show that the wavelength does not change and is not affected by pressure.
    Keywords: FBG; temperature; sensor; pressure; ocean.
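    Using the reported sensitivity of 29.9 pm/℃, the total Bragg wavelength shift over the tested 0~35℃ range is easy to check (the 1550 nm nominal wavelength is an assumption; the abstract does not state it):

```python
K_T = 29.9           # temperature sensitivity in pm/degC (reported in the abstract)
dT = 35.0            # tested temperature span, 0-35 degC
shift_pm = K_T * dT  # total wavelength shift in picometres
print(shift_pm)      # -> 1046.5 pm, i.e. about 1.05 nm across the range

lambda0_nm = 1550.0  # assumed nominal Bragg wavelength (not given in the abstract)
print(lambda0_nm + shift_pm / 1000.0)  # wavelength at 35 degC under that assumption
```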

  • Design and Development of Smart fishing Poles using ICT enabled systems   Order a copy of this article
    by Zhenghua Xin, ma Lu, Guolong Chen, Hong Li, Qixiang Song, Meng Xiao 
    Abstract: Consumers want fishing poles with high quality, diverse functions and personalisation. The intelligent float presented here focuses mainly on such functions. Compared with existing fishing-pole products, it supports night fishing: the float carries an LED night light that flashes when placed in a river, lake or the sea, and its green light is easy to recognise, making fishing more accurate and enjoyable. The float also has an accelerometer sensor: when a fish suddenly pulls the float under water, the resulting acceleration turns the light red, prompting that a fish is on the hook, which is especially convenient in the evening. Since the smartphone is now central to personal communication, a Bluetooth module is linked with the float so that anglers can use their phones while fishing. When a fish bites the bait on the hook, the line triggers an infrared sensor, an SMS whose content is "get fish" is generated and sent to the angler's phone, telling the angler to take up the rod and line, and a buzzer sounds as an automatic alarm.
    Keywords: smart fishing poles; infrared sensors; Bluetooth.

  • Detection and Filling of Pseudo-hole in Complex Curved Surface objects   Order a copy of this article
    by Mei Zhang 
    Abstract: The detection and filling of pseudo-holes in complex curved surfaces has always been a hot and difficult problem in the computer vision field. Aiming at the defect that the traditional method can only detect pseudo-hole regions of small curvature, this paper proposes a new method to improve pseudo-hole detection. Firstly, holes are classified by a projection method into simple holes and complex holes, and the method for filling simple holes is described. A complex hole is divided into a number of simple holes, each simple hole is filled to complete the filling of the complex pseudo-hole, and finally the filling results are mapped back to the object's hole area.
    Keywords: Laser point cloud; complex curved surfaces object; pseudo-hole detection; pseudo-hole filling

  • A variant of Random WayPoint mobility model to improve routing in Wireless Sensor Networks   Order a copy of this article
    by Lyamine Guezouli, Kamel Barka, Souheila Bouam, Abdelmadjid Zidani 
    Abstract: The mobility of nodes in a wireless sensor network is a factor affecting the quality of service offered by the network. We consider node mobility an opportunity when the nodes move in an appropriate manner, and routing algorithms can benefit from this opportunity. The purpose of our work is to study a mobility model and adapt it to ensure optimal routing in a dynamic network. We apply a variant of the Random WayPoint mobility model (named Routing-Random WayPoint, "R-RWP") to the whole network in order to maximise the coverage radius of the Base Station (which is fixed in our study) and thus optimise the end-to-end data delivery delay.
    Keywords: WSN; wireless sensor networks; RWP; random waypoint; mobility model; routing.
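    A minimal Random WayPoint trace, the base model that R-RWP modifies, can be sketched as follows (the area size, speed and the omission of pause times are illustrative assumptions):

```python
import math
import random

def random_waypoint(steps, area=(100.0, 100.0), speed=5.0, seed=42):
    """Minimal Random WayPoint trace: pick a random waypoint, move toward
    it at constant speed, and pick a new one on arrival (pause omitted)."""
    rng = random.Random(seed)
    x, y = rng.uniform(0, area[0]), rng.uniform(0, area[1])
    wx, wy = rng.uniform(0, area[0]), rng.uniform(0, area[1])
    trace = [(x, y)]
    for _ in range(steps):
        dx, dy = wx - x, wy - y
        dist = math.hypot(dx, dy)
        if dist <= speed:            # waypoint reached: choose a new one
            x, y = wx, wy
            wx, wy = rng.uniform(0, area[0]), rng.uniform(0, area[1])
        else:                        # advance one step toward the waypoint
            x += speed * dx / dist
            y += speed * dy / dist
        trace.append((x, y))
    return trace

trace = random_waypoint(50)
assert all(0 <= px <= 100 and 0 <= py <= 100 for px, py in trace)
```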

  • A Modified Extended Particle Swarm Optimization Algorithm to Solve the Directing Orbits of Chaotic Systems   Order a copy of this article
    by Simin Mo, Jianchao Zeng, Weibin Xu, Chaoli Sun 
    Abstract: To address the poor local search capability of the extended particle swarm optimisation algorithm (EPSO), a modified extended particle swarm optimisation algorithm (MEPSO) is proposed, which reduces the magnitude of the total force exerted on each particle by decreasing the number of particles that influence it. The number of particles removed is analysed theoretically, and it is proved that MEPSO converges to the global optimum with probability 1. Compared with related algorithms, the presented algorithm effectively balances global and local search and improves optimisation performance. Finally, MEPSO better solves the problem of directing orbits of chaotic systems.
    Keywords: Extended Particle Swarm Optimization algorithm; magnitude of total forces exerting on each particle; global and local search.
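    For context, a minimal global-best PSO on the sphere function is sketched below; EPSO and MEPSO differ in how inter-particle forces are accumulated, which this sketch does not reproduce, so it only shows the base mechanism being modified:

```python
import random

def pso_sphere(n_particles=20, dims=3, iters=100, seed=1):
    """Minimal global-best PSO minimising the sphere function f(x) = sum x^2."""
    rng = random.Random(seed)
    f = lambda p: sum(x * x for x in p)
    pos = [[rng.uniform(-5, 5) for _ in range(dims)] for _ in range(n_particles)]
    vel = [[0.0] * dims for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # personal bests
    gbest = min(pbest, key=f)[:]           # global best
    start = f(gbest)
    w, c1, c2 = 0.7, 1.5, 1.5              # inertia and attraction weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dims):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return start, f(gbest)

start, final = pso_sphere()
assert final <= start  # the swarm never loses its best-so-far solution
```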

  • Development and application research of smart distribution district based on IDTT-B new-type transformer terminal unit   Order a copy of this article
    by Aidong Xu, Lefeng Cheng, Xiaobin Guo, Ganyang Jian, Tao Yu, Wenxiao Wei, Li Yu 
    Abstract: Aimed at the problems of the low degree of automation and the lack of remote monitoring of operation conditions in distribution districts, a smart distribution district (SDD) based on an IDTT-B-type transformer terminal unit (TTU) is designed, providing scientific and advanced technical methods for operation management units to achieve fine-grained management of distribution districts. The construction of the SDD is described, i.e. the upgrading and reconstruction of the original distribution district, the construction of the new-type SDD based on the IDTT-B TTU, and the building of the communication network and main station. The focus is on the design of the IDTT-B-type TTU, which performs transformer monitoring, power quality monitoring, temperature measurement and low-voltage switch communication; it is highly integrated, supports remote control, communication and software upgrading, and offers low investment and high cost performance. The technical features, overall performance and application situation of the IDTT-B-based SDD are introduced. Finally, an application example of the SDD based on the IDTT-B is given, and power quality detection and analysis are performed. The construction of the new-type distribution district has significance for the unified building of a strong and smart grid, and provides guidance and reference for operation and management units.
    Keywords: distribution district; transformer terminal unit; distribution automation; operation management; safe protection; main station.
    DOI: 10.1504/IJICT.2019.10007131
    by Sara Tedmori 
    Abstract: In spite of its success in a wide variety of applications, data mining technology raises a variety of ethical concerns, including privacy, intellectual property rights and data security. In this paper, the author focuses on the privacy problem of unauthorised use of information obtained from knowledge discovered through secondary usage of data in clustering analysis. To address this problem, the author proposes a combination of isometric data transformation methods as an approach to guarantee that data mining does not breach privacy. The three transformation methods of reflection, rotation and translation are used to distort confidential numerical attributes so as to satisfy the privacy requirements while maintaining the general features of the clusters in clustering analysis. Experimental results show that the proposed algorithm is effective and provides an acceptable balance between privacy and accuracy.
    Keywords: Privacy Preserving; Data Mining; Discovering Knowledge; Data Engineering.
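    The key property exploited above, that reflections, rotations and translations are isometries and therefore preserve the pairwise distances clustering relies on, can be checked with a small sketch (the 2-D points and rotation angle are arbitrary):

```python
import math

def rotate(points, theta):
    """2-D rotation about the origin: an isometric transformation."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def translate(points, tx, ty):
    """2-D translation: also an isometric transformation."""
    return [(x + tx, y + ty) for x, y in points]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

pts = [(0.0, 0.0), (3.0, 4.0), (-1.0, 2.0)]
distorted = translate(rotate(pts, 1.234), 10.0, -7.0)

# Cluster geometry survives the distortion: all pairwise distances match.
for i in range(len(pts)):
    for j in range(i + 1, len(pts)):
        assert math.isclose(dist(pts[i], pts[j]),
                            dist(distorted[i], distorted[j]))
```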

  • Recognition of the Anti-Collision Algorithm for RFID Systems Based on Tag Grouping   Order a copy of this article
    by Zhi Bai, Yigang He 
    Abstract: In an RFID system, collision between tags is one of the key problems that must be solved. An anti-collision algorithm for RFID systems based on tag grouping is put forward. Compared with conventional algorithms, when there are a large number of tags in the field, the proposed algorithm achieves high system efficiency by restricting the number of unread tags that respond. Simulation results show that the proposed algorithm improves slot efficiency by at least 80% over conventional algorithms when the number of tags reaches 1000.
    Keywords: RFID; anti-collision algorithm; tag grouping; adaptive frame slotted
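    The benefit of restricting how many tags answer per frame can be seen from the classical frame-slotted ALOHA efficiency formula (the frame sizes below are illustrative; the paper's grouping rule is not reproduced here):

```python
def slot_efficiency(n_tags, n_slots):
    """Expected fraction of slots holding exactly one tag reply in
    frame-slotted ALOHA; it peaks near 1/e when n_tags == n_slots."""
    p = 1.0 / n_slots
    # P(a given slot is chosen by exactly one of the n tags), per slot
    return n_tags * p * (1 - p) ** (n_tags - 1)

# Ungrouped: 1000 tags contend in a 256-slot frame -> low efficiency.
print(round(slot_efficiency(1000, 256), 3))  # -> 0.078
# Grouped: only ~256 tags respond per frame -> near the 36.8% optimum.
print(round(slot_efficiency(256, 256), 3))   # -> 0.369
```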

  • Towards a robust palmprint representation for person identification   Order a copy of this article
    by Meraoumia Abdallah, Bendjenna Hakim, Chitroub Salim 
    Abstract: Biometrics, the automatic identification of individuals based on their physiological and/or behavioural characteristics, is a widely studied field. This identification technology is rapidly evolving and has very strong potential for wide adoption in civilian applications such as e-banking, e-commerce and access control. Among the physiological biometric modalities, those based on the palm have received the most attention due to its steady and unique features, which are rich in information even at low resolution. Although several palm capture devices exist, none of them can provide the full features of the same palm. With different capture devices, palm features can be represented in different formats, such as greyscale images, near-infrared images, colour images, multispectral images and 3D shapes. In this context, this paper presents a study aiming to propose a robust palmprint representation for a reliable person identification system. A comparative study of the palmprint image representations used in practice is performed, and a new scheme for improving person identification using palmprint images is proposed. The proposed scheme combines the results of several classifiers and uses the discrete CoNtourlet Transform (2D-CNT) as the feature extraction technique. At each level, the palm image is decomposed into several bands using the 2D-CNT, and some of the resulting bands are used to create the feature vector. Given this feature vector, two sub-systems are created: the first is based directly on the feature vector, while the second uses a Hidden Markov Model (HMM) to model it. The classifier results are combined using matching-score-level fusion. The proposed system is tested and evaluated on several Hong Kong Polytechnic University databases containing 400 users.
    Keywords: Biometrics; Person Identification; Palmprint; Feature extraction; Contourlet transform; Hidden Markov Model; Data fusion.

  • Robust Adaptive Array Processing based on Modified Multistage Wiener Filter Algorithm   Order a copy of this article
    by Peng Wang, Ke Gong, Shuai-bin Lian, Qiu-ju Sun, Wen-xia Huang 
    Abstract: The multistage Wiener filter (MSWF) is a very efficient algorithm for adaptive array processing because of its low complexity and prominent rank-reduction advantage. However, if the training sample data are contaminated by outliers, especially outliers having the same DOA as the target, MSWF performance degrades severely. In this paper, the backward iteration of the MSWF is improved: a median cascaded canceler (MCC) strategy is adopted so that the optimal weight calculation is obtained via sorting and median processing, removing the impact of outliers effectively. The blocking matrix of the MSWF forward iteration is computed by Householder transform to enhance fixed-point performance. The new algorithm attains an excellent compromise between robustness and complexity. To verify its performance, an array with 50 elements was established on a simulation platform, and the simulated results prove that it can cope effectively with outlier-contaminated applications.
    Keywords: Multistage Wiener filter; rank-reduction; Householder transform; outlier; median cascaded canceler;
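    The robustness idea behind the median cascaded canceler, that a median resists outliers where a mean does not, can be shown in miniature (the sample values are arbitrary):

```python
def median(xs):
    """Median of a list of numbers (no external dependencies)."""
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

# Training samples around a true level of 1.0, with one strong outlier.
samples = [1.02, 0.98, 1.01, 0.99, 1.00, 25.0]
mean = sum(samples) / len(samples)
med = median(samples)
print(mean, med)  # the mean is dragged toward the outlier; the median is not

assert abs(med - 1.0) < abs(mean - 1.0)
```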

  • Improving Multidimensional Point Query Search using Multiway Peer-to-Peer Tree Network   Order a copy of this article
    by Shivangi Surati, Devesh Jinwala, Sanjay Garg 
    Abstract: Nowadays, Peer-to-Peer (P2P) networks are widely accepted in multidimensional applications like social networking, multiplayer games, P2P e-learning, P2P mobile ad-hoc networks, etc. P2P overlay networks combining Multidimensional Indexing (MI) methods are preferable for efficient multidimensional point or range search in a distributed environment. However, point query search in existing P2P systems has limitations: (i) it either does not support MI or uses replication to support MI, or (ii) the point query search cost is limited to O(log2 N). Hence, traditional MI techniques based on the multiway tree structure (having larger fanout) can be employed to enhance multidimensional point query search. Based on our observations, a hybrid model combining an m-ary (m = fanout of the tree, m > 2) P2P tree network and MI based on space containment relationships is preferred, reducing the point query search performance bound to O(logm N) using a single overlay network. The present paper shows how this model improves point query search performance to O(logm N) steps, independent of the dimensionality of the objects.
    Keywords: Peer-to-Peer overlay networks; Distributed computing; Multidimensional Indexing; Point query search; Multiway trees
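    The claimed improvement from O(log2 N) to O(logm N) is easy to quantify with a worst-case hop count over a balanced m-ary overlay (N and the fanouts below are illustrative):

```python
def search_hops(n_peers, fanout):
    """Worst-case lookup hops in a balanced m-ary tree overlay:
    the smallest h with fanout**h >= n_peers (integer arithmetic
    avoids floating-point rounding in log computations)."""
    hops, capacity = 0, 1
    while capacity < n_peers:
        capacity *= fanout
        hops += 1
    return hops

n = 65536
print(search_hops(n, 2))   # -> 16 hops for a binary overlay
print(search_hops(n, 16))  # -> 4 hops when the fanout is 16
```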

  • IoT-based risk monitoring system for safety management in warehouses   Order a copy of this article
    by Sourour Trab, Ahmed Zouinkhi, Eddy Bajic, Mohamed Naceur Abdelkrim, Hassen Chekir 
    Abstract: This paper relies on the concepts and architecture of the IoT to design a risk monitoring system for a hazardous product warehouse. Enhancing a product into a smart product, a sensor-equipped communicating device, allows the product's interactions to be controlled and monitored with the objective of risk prevention and avoidance. A generic warehouse safety policy supported by the smart products is presented, relying on a set of parametric safety rules for the storage, picking and handling of products. Our proposal aims to bring the benefits of information availability, communication and decision-making deep into the warehousing physical world, oriented toward global safety assurance. We present an implementation case for chemical product warehousing that uses a ZigBee wireless sensor network platform and LabVIEW software. Smart products and remote monitoring enable dynamic risk assessment by analysing product information and status and the ambient condition parameters of the warehouse.
    Keywords: IoT; Risk monitoring system; WMS; Intelligent product; Safety management.

  • The Study of Access Point Outdoor Coverage Deployment for Wireless Digital Campus Network   Order a copy of this article
    by Augustinus B. Primawan, Nitin K. Tripathi 
    Abstract: Wireless Local Area Network design needs further development to obtain appropriate and effective results. Site surveys in the design process give realistic results but require time and effort. Predicting signal strength using empirical models can yield appropriate access point placements with good signal coverage. Geospatial analyses such as Inverse Distance Weighting, Kriging and Global Polynomial Interpolation were compared, and this study showed that Kriging is an appropriate method to predict values over the coverage area. Furthermore, predictive signal strength models such as the classical, empirical and COST 231 Hata models were studied; the empirical model gave the best predictive calculations. The empirical signal strength model combined with Kriging geostatistical analysis gave usable signal coverage predictions for access point placement. This model will support GIS spatial analysis tools for effective planning of access point placement.
    Keywords: Access Point Placement; GIS Spatial Analysis; Received Signal Level.
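    One of the compared interpolators, Inverse Distance Weighting, is simple enough to sketch directly (the survey points and power parameter are illustrative):

```python
import math

def idw(samples, qx, qy, power=2.0):
    """Inverse Distance Weighting: estimate signal strength at (qx, qy)
    from surveyed points given as [(x, y, rssi_dbm), ...]."""
    num = den = 0.0
    for x, y, v in samples:
        d = math.hypot(qx - x, qy - y)
        if d == 0.0:
            return v            # query coincides with a survey point
        w = 1.0 / d ** power    # closer samples weigh more
        num += w * v
        den += w
    return num / den

survey = [(0, 0, -40.0), (10, 0, -60.0), (0, 10, -55.0), (10, 10, -70.0)]
est = idw(survey, 5, 5)
# The estimate stays within the range of the surveyed values.
assert min(v for *_, v in survey) <= est <= max(v for *_, v in survey)
```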

  • Feature-Opinion Pair Identification Method in Two-Stage based on Dependency Constraints   Order a copy of this article
    by Shulong Liu, Xudong Hong, Zhengtao Yu, Hongying Tang, Yulong Wang 
    Abstract: Feature-opinion pair identification, which includes the extraction of opinion words and opinion targets and the identification of their relations, is important for analysing online reviews. In this paper, we propose a two-stage feature-opinion pair identification method based on dependency constraints, exploiting the relationship between feature-opinion pair identification and dependency constraints. In the first stage, we construct dependency constraints from the dependency information of words; the constraints and seed words are then employed to extract opinion words and opinion targets. In the second stage, the opinion words and opinion targets extracted in the first stage are used to construct candidate feature-opinion pairs, and dependency constraints, location features and part-of-speech features are integrated into a support vector machine to identify feature-opinion pairs. Experimental results on online reviews demonstrate that the proposed method is effective in the identification of feature-opinion pairs, reaching an F-score of 83.85%.
    Keywords: opinion mining; opinion word; opinion target; dependency constraints; feature-opinion pair.

  • Low-Complexity LDPC-Convolutional Codes based on Cyclically Shifted Identity Matrices   Order a copy of this article
    by Fotios Gioulekas, Constantinos Petrou, Athanasios Vgenis, Michael Birbas 
    Abstract: In this study, a construction methodology for Low-Density Parity-Check Convolutional Code (LDPC-CC) ensembles based on cyclically shifted identity matrices is proposed. The proposed method directly generates the syndrome-former matrices according to the specified code parameters and constraints, i.e. code rate, degree distribution, constraint length, period and memory, in contrast to the majority of available approaches, which produce relevant error-correcting codes from block codes, protographs or spatially-coupled codes. Simulation results show that the constructed ensembles demonstrate improved error-correcting capability of up to 0.2 dB in terms of frame-error and bit-error rates at the convergence region, when compared with the error-correcting schemes adopted by various communication standards, with equivalent hardware complexity even at short codeword lengths. Specifically, the constructed LDPC-CCs have been assessed against the corresponding error-correcting codes used in WiMAX and other standards for wireless and wireline telecommunications.
    Keywords: FEC; LDPC-Convolutional Codes; complexity; error-correction; WiMAX; cyclically shifted identity matrices; LDPC-Block Codes; schedulable memory; syndrome-former.
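    The circulant building block named in the title, a cyclically shifted identity matrix, can be constructed in a few lines (the size and shift are arbitrary):

```python
def shifted_identity(size, shift):
    """size x size identity matrix cyclically shifted right by `shift`:
    the circulant permutation block used in quasi-cyclic LDPC parity
    matrices."""
    return [[1 if (r + shift) % size == c else 0 for c in range(size)]
            for r in range(size)]

P = shifted_identity(4, 1)
for row in P:
    print(row)

# Every row and every column contains exactly one 1 (a permutation matrix).
assert all(sum(row) == 1 for row in P)
assert all(sum(P[r][c] for r in range(4)) == 1 for c in range(4))
```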

  • A Big Data and Cloud Computing Specification, Standards and Architecture: Agricultural and Food Informatics   Order a copy of this article
    by N.P. Mahalik, Na Li 
    Abstract: Big data has gone from an emerging to a widely used technology in industrial, commercial, research and database applications. It is used for processing and analysing massive data sets to derive useful patterns, inferences and relations, in which the real-time data storage and management architecture plays an important role. This paper introduces big data, including its background, definitions, characteristics, related technologies, and the challenges associated with implementing big data-based application technologies. It also introduces cloud computing, a related yet independent emerging technology, covering modern technologies and standards, definitions, service and deployment models, advantages, challenges and development prospects. The paper considers computing specifications, standardised procedures and system architecture with regard to big data systems and cloud computing.
    Keywords: Big data; KDD; data mining; cloud computing; Virtualization; Architecture.

  • Wi-Fi Received Signal Strength Based Hyperbolic Location Estimation for Indoor Positioning Systems   Order a copy of this article
    by Anvar Narzullaev, MOHD Hasan Selamat, Khaironi Yatim Sharif, Zahriddin Muminov 
    Abstract: Nowadays, Wi-Fi fingerprinting-based positioning systems give enterprises the ability to track their various resources more efficiently and effectively. The main idea behind fingerprinting is to build a signal strength database of the target area prior to location estimation. This process is called calibration, and positioning accuracy depends heavily on calibration intensity. Unfortunately, the calibration procedure requires a huge amount of time and effort, making large-scale deployments of Wi-Fi-based indoor positioning systems non-trivial. In this research we present a novel location estimation algorithm for Wi-Fi-based indoor positioning systems that combines signal sampling and hyperbolic location estimation techniques to estimate the location of mobile users. The algorithm achieves cost-efficiency by reducing the number of fingerprint measurements while providing reliable location accuracy, and it requires no additional hardware upgrades to the existing network infrastructure. Experimental results show that the proposed algorithm, with its easy-to-build signal strength database, performs more accurately than conventional signal-strength-based methods.
    Keywords: Indoor positioning; hyperbolic location estimation; wi-fi fingerprinting; TDOA; trilateration; received signal strength.
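    Signal-strength-based ranging of the kind such systems build on is often described with the log-distance path-loss model; the sketch below (with an assumed reference power and path-loss exponent) shows the model and its inversion, not the paper's algorithm:

```python
import math

def rss_at(d, rss0=-40.0, n=3.0, d0=1.0):
    """Log-distance path-loss model: predicted RSS (dBm) at distance d
    metres, given RSS rss0 at reference distance d0 and exponent n
    (both assumed values here)."""
    return rss0 - 10.0 * n * math.log10(d / d0)

def distance_from_rss(rss, rss0=-40.0, n=3.0, d0=1.0):
    """Invert the model to recover distance from a measured RSS."""
    return d0 * 10.0 ** ((rss0 - rss) / (10.0 * n))

# Round-trip check: predict RSS at 7.5 m, then recover the distance.
d = 7.5
assert math.isclose(distance_from_rss(rss_at(d)), d, rel_tol=1e-9)
```

    Distances recovered this way from several access points can then feed a trilateration or hyperbolic solver.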

  • Target coverage algorithm with energy constraint for wireless sensor networks   Order a copy of this article
    by Liandong Lin, Chengjun Qiu 
    Abstract: As wireless sensor networks are made up of low-cost, low-power tiny sensor nodes, it is of great importance to study how to cover targets while saving energy. In this paper, we propose a novel target coverage algorithm with an energy constraint for wireless sensor networks. A wireless sensor network can be described as a graph model in which nodes and edges represent sensors and maximum signal transmission ranges, respectively. In particular, three types of nodes are utilised: 1) base stations, 2) gateways and 3) sensors. The main innovations of this paper are that we organise the network lifetime in a cyclic mode and divide it into rounds of equal duration. At the beginning of each round, sensors independently determine which sensing units should be turned on in the working step; the status of each sensing unit is then determined by integrating its sensing ability and remaining energy. Finally, we construct a simulation environment to test the performance of our algorithm. Experimental results demonstrate that the proposed algorithm performs better than the remaining-energy-first and max-lifetime target coverage schemes under various numbers of sensors and attributes, and its performance is second only to integer programming. Furthermore, the proposed algorithm effectively covers targets with low energy consumption.
    Keywords: Wireless sensor networks; Target coverage; Energy constraint; Network lifetime.

  • Safety Message Data Transmission Model and Congestion Control Scheme in VANET   Order a copy of this article
    by Zhixiang Hou, Jiakun Gao 
    Abstract: When a VANET experiences high traffic density, beacons produced by periodic safety messages may occupy the whole channel bandwidth, resulting in link congestion. To ensure safe message data transmission and effective congestion control, we propose an active safety-information congestion control framework, derived from an analysis of VANET features, comprising channel detection, load estimation, congestion control and sending restoration. On this basis, we propose a novel congestion control mechanism that adjusts the beacon frequency and the vehicle communication model. The CACP algorithm is adopted to estimate the link bandwidth for congestion prediction, and a channel assignment algorithm for periodic status messages and security messages is proposed to ensure that enough channel resource remains to transfer emergency messages. As a consequence, status message accuracy and system safety are ensured and the number of accommodated users is increased, avoiding network congestion and improving channel utilisation. Simulation results show that the proposed algorithm can effectively and accurately detect link load, improve throughput, decrease delay, reduce network energy consumption and guarantee data fidelity. We conclude that the proposal achieves a safe and efficient transmission mechanism for VANETs.
    Keywords: VANET; Congestion control; Channel; Beacon; D-FPAV.

  • Design and Implementation of ITS Information Acquisition System under IoS Environment   Order a copy of this article
    by Ying Zhang, Jiajun Li, Baofei Xia 
    Abstract: The problems and efficiency of existing intelligent transportation system (ITS) models are studied, and the significance of the model and architecture of ITS under the Internet of things (IoS) is discussed. Reviewing the research status at home and abroad on the relationship between IoS and ITS, the necessity of introducing IoS technology into ITS is explained first. Then, based on the logical structure and physical model of ITS, the ITS structure model under the IoS environment is established, and the system architecture of a complete and comprehensive ITS information acquisition system is formed. The design principles of system requirements, overall design, function module design and database design are described in detail. The key modules of the system are introduced and tested. The test results show that the acquisition system can effectively monitor vehicle speed and road traffic in real time and obtain effective information on vehicles and the road environment. It can also send the acquired information via the Internet or a GPRS network to the data processing centre for further processing and intelligent decision-making in the ITS.
    Keywords: ITS; IoS; information acquisition; GPRS; communication.

  • Traffic Route Optimization Based on Clouding Computing Parallel ACS   Order a copy of this article
    by Changyu Li 
    Abstract: Intelligent traffic demands massive-data environments and high-performance processing, which require a cloud computing platform to process the massive data and distributed parallel guidance algorithms to improve system efficiency. This paper therefore proposes an improved ant colony system (ACS) scheme based on cloud computing. It first adopts MapReduce to parallelise the traditional ACS, solving the problem in a distributed parallel mode and remedying the defects of ACS. The improved ACS applies the Map function to parallelise the most time-consuming part, namely the independent solution process of each ant, while the Reduce function handles pheromone updating and the collection of better solutions. To address ACS's long search time and premature convergence to non-optimal solutions, we integrate a simulated annealing algorithm into ACS and give the corresponding realisation process. In the experiments, a Hadoop cloud computing platform is constructed and the improved algorithm is run and tested on it. Analysis of the experimental results shows that our parallel ACS improves the query efficiency of the shortest path and has advantages in running time and speedup ratio compared with classic algorithms.
    Keywords: cloud computing; ACS; MapReduce; Traffic network; Pheromone.

  • Optimal configuration of M-for-N shared spare-server systems   Order a copy of this article
    by Hirokazu Ozaki 
    Abstract: In this study, we investigate the user-perceived availability of M-for-N shared spare-server systems. We assume that there are N identical working servers, each serving a single user group, and M identical shared spare servers in the system. We also assume that the time to failure of a server follows an exponential distribution, and the time needed to repair a failed server follows an Erlang type-k distribution. Under these assumptions, our numerical computation shows that there exists an optimal size (M + N) for shared spare-server systems with respect to availability and cost under a given condition.
    Keywords: Cloud computing; user-perceived reliability; shared protection systems; probability distribution; availability.
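    A simplified version of this model can be computed directly: the sketch below uses a birth-death chain on the number of failed servers with exponential (rather than Erlang type-k) repair and independent repair of every failed server, so it only approximates the paper's setting:

```python
def availability(N, M, lam, mu):
    """Rough availability of an M-for-N shared-spare system: a birth-death
    chain on the number of failed servers, exponential failure rate lam
    per active server and repair rate mu per failed server (simplifying
    assumptions; the paper uses Erlang type-k repair times)."""
    states = N + M                      # up to N+M servers can be failed
    unnorm = [1.0]                      # product-form steady-state weights
    for i in range(states):
        active = min(N, N + M - i)      # servers actually serving users
        fail_rate = active * lam        # transition i -> i+1
        repair_rate = (i + 1) * mu      # transition i+1 -> i
        unnorm.append(unnorm[-1] * fail_rate / repair_rate)
    total = sum(unnorm)
    probs = [u / total for u in unnorm]
    # Service survives while failures do not exceed the M spares.
    return sum(probs[: M + 1])

a1 = availability(N=10, M=1, lam=0.01, mu=1.0)
a3 = availability(N=10, M=3, lam=0.01, mu=1.0)
assert 0.0 < a1 < a3 < 1.0   # more spares -> higher availability
```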

  • Image analysis by efficient Gegenbauer Moments Computation for 3D Objects Reconstruction   Order a copy of this article
    by Bahaoui Zaineb, Hakim El Fadili, Khalid Zenkouar, Hassan Qjidaa 
    Abstract: In this paper, we suggest a new technique for fast computation of Gegenbauer orthogonal moments for the reconstruction of 3D images/objects. A typical comparison of the proposed method with conventional ZOA methods shows significant improvements in terms of error reduction, image quality and computation time. We then compare our new approach with existing methods using Legendre and Zernike moments in the 3D image/object case. The obtained results show that although Legendre and Zernike moments are slightly better than Gegenbauer moments, the latter are still very efficient and give very good results in terms of MSE and PSNR, while Zernike moments have a higher computational cost than Gegenbauer moments.
    Keywords: Gegenbauer Moments computation; Legendre Moments; Zernike moments; 3D images/object; Computation time.
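For readers unfamiliar with Gegenbauer moments, a direct (non-fast) 2-D computation looks roughly as follows. The three-term recurrence is standard; the normalisation constants and the paper's fast-computation scheme are omitted, and the function names are illustrative.

```python
import numpy as np

def gegenbauer_basis(order, alpha, x):
    """Gegenbauer polynomials C_n^{(alpha)}(x) for n = 0..order via the
    standard three-term recurrence."""
    C = [np.ones_like(x), 2.0 * alpha * x]
    for n in range(2, order + 1):
        C.append((2.0 * x * (n + alpha - 1.0) * C[-1]
                  - (n + 2.0 * alpha - 2.0) * C[-2]) / n)
    return np.stack(C[:order + 1])

def gegenbauer_moments(img, order, alpha=0.5):
    """Sketch: direct Gegenbauer moments of a 2-D image mapped onto
    [-1, 1] x [-1, 1]."""
    h, w = img.shape
    Cx = gegenbauer_basis(order, alpha, np.linspace(-1.0, 1.0, w))
    Cy = gegenbauer_basis(order, alpha, np.linspace(-1.0, 1.0, h))
    # M[q, p] = sum_{i,j} C_q(y_i) * C_p(x_j) * img[i, j]
    return Cy @ img @ Cx.T
```

With alpha = 0.5 the recurrence reproduces the Legendre polynomials, which is what makes the Gegenbauer family a natural generalisation of Legendre moments.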

  • Integration of a quantum scheme for key distribution and authentication within EAP-TLS protocol   Order a copy of this article
    by GHILEN AYMEN, Mostafa AZIZI 
    Abstract: The extensive deployment of wireless networks has led to significant progress in security approaches that aim to protect confidentiality. The current methods for exchanging a secret key within the Extensible Authentication Protocol-Transport Layer Security (EAP-TLS) protocol are based on a Public Key Infrastructure (PKI). Although this technique remains one of the most widely implemented solutions for authenticating users and ensuring secure data transmission, its security is only computational: with the emergence of the quantum computer, existing cryptosystems will become completely insecure. Improving contemporary cryptographic schemes by integrating quantum cryptography becomes a much more attractive prospect, since its technology does not rely on hard mathematical problems such as factoring large integers or computing discrete logarithms. Thus, we propose a quantum extension of EAP-TLS that allows a cryptographic key to be exchanged and a remote client to be authenticated with unconditional security, ensured by the laws of quantum physics. The PRISM tool is applied as a probabilistic model checker to verify specific security properties of the new scheme.
    Keywords: EAP-TLS; Quantum Cryptography; Authentication; Key Agreement; Entanglement; PRISM; Model Checking.

  • A rapid detection method of earthquake infrasonic wave based on decision-making tree and the BP neural network   Order a copy of this article
    by Yun Wu, Zuoxun Zeng 
    Abstract: In this paper, a rapid detection method for earthquake infrasonic waves combining a decision tree and a neural network is proposed. The method is designed for automated monitoring of earthquake occurrence and advance forecasting. First, different kinds of signal data are collected and analysed to find the most meaningful attributes for describing the features of a signal. The decision stage then uses two lines of decision to produce a final result. In the first, the important attributes are chosen as the nodes of a decision tree. In the second, previously stored signals are analysed with the neural network to build a mapping model between signal attributes and their classification. Finally, the most suitable node sequence and thresholds are determined through two experiments. An experimental analysis is also presented to discuss some important issues in the decision tree and to finalise the decision system.
    Keywords: Infrasonic Wave; Earthquake; Decision-making Tree; Neural Network.

  • Data Dissemination on MANET by repeatedly transmission via CDN nodes   Order a copy of this article
    by Nattiya Khaitiyakun, Teerapat Sanguankotchakorn, Kanchana Kanchanasut 
    Abstract: Recently, much research on MANETs (mobile ad hoc networks) has been carried out owing to their various applications in information exchange. Efficient data dissemination in such an infrastructure-less environment is considered one of the challenging issues. This paper proposes to adopt the concept of the CDN (Content Delivery Network) technique, normally used in the Internet, for disseminating information in a MANET. The source node disseminates data to surrounding nodes by repeatedly transmitting batches of packets via a set of CDN nodes acting as relays. Our data dissemination technique via CDN nodes is built on the OLSR (Optimized Link State Routing) protocol. A limited number of CDN nodes is selected from the MPRs (multi-point relays) of OLSR so as to optimally cover all subscriber nodes while avoiding interference. Packets are then transmitted from the CDN nodes to destination nodes using the same broadcasting technique as adopted by MPRs. The performance of the proposed technique is evaluated in terms of probability of successful transmission by simulation using NS3, and is compared with typical OLSR and with a recent clustering-based data transmission algorithm for VANETs (vehicular ad hoc networks). The proposed algorithm drastically improves the overall probability of successful transmission compared with typical OLSR, and achieves a higher probability of successful transmission at high node density than the clustering-based algorithm. Finally, a closed-form mathematical expression for the probability of successful transmission of the proposed algorithm in a multi-hop network environment is derived and verified.
    Keywords: Content Delivery Network (CDN); MANET; OLSR.
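The selection of a limited CDN set from the MPRs can be illustrated with a plain greedy cover heuristic; the actual OLSR-based selection also accounts for interference, which this sketch ignores, and the data structures are assumptions.

```python
def select_cdn_nodes(coverage, subscribers):
    """Greedy sketch: pick MPR candidates until every subscriber is covered.
    `coverage` maps a candidate node id to the set of subscriber ids it
    can reach by broadcast."""
    uncovered = set(subscribers)
    chosen = []
    while uncovered:
        # candidate covering the most still-uncovered subscribers
        best = max(coverage, key=lambda n: len(coverage[n] & uncovered))
        gain = coverage[best] & uncovered
        if not gain:
            raise ValueError("unreachable subscribers: %s" % sorted(uncovered))
        chosen.append(best)
        uncovered -= gain
    return chosen
```

Greedy set cover is the classic approximation here; an exact minimum cover is NP-hard, which is one reason protocols work from the already-computed MPR sets.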

  • Suboptimal Joint User Equipment Pairing and Power Control for Device-to-Device Communications Underlaying Cellular Networks   Order a copy of this article
    by Chaoping Guo, Xiaoyan Li, Wei Li, Hongyang Li 
    Abstract: In device-to-device (D2D) communications underlaying cellular networks, the cellular interference suffered by D2D user equipment (DUE) is larger than the D2D interference suffered by cellular user equipment (CUE). A joint resource allocation scheme is presented that performs both user equipment pairing and power allocation to minimise the total interference experienced by DUEs and CUEs. The scheme consists of two stages: first, the base station assigns power to each CUE and each D2D transmitter by a graphical method; second, it selects the optimal CUE to pair with each D2D pair using a modified Hungarian algorithm so as to minimise the total interference. Simulation results show that the proposed scheme not only decreases the interference caused by D2D pairs and by CUEs, but also increases the number of admitted D2D connections.
    Keywords: device-to-device communications; power control; resources allocation; Suboptimal Joint; Cellular Networks; User Equipment.
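Once the power levels from the first stage fix an interference matrix, the second stage reduces to an assignment problem, which is what the (modified) Hungarian algorithm solves. SciPy's implementation illustrates the unmodified case; the matrix values and function name below are made up.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def pair_cue_d2d(interference):
    """Sketch: pair each D2D link with a CUE so the summed interference is
    minimal. interference[c, d] = hypothetical total interference if CUE c
    shares its resource block with D2D pair d (powers already fixed)."""
    cue_idx, d2d_idx = linear_sum_assignment(interference)
    total = float(interference[cue_idx, d2d_idx].sum())
    return list(zip(cue_idx.tolist(), d2d_idx.tolist())), total
```

The Hungarian method finds the optimal one-to-one pairing in polynomial time, which is why it is the standard tool for this matching step.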

  • A Cognitive Approach of Collaborative Requirements Validation based on Action Theory   Order a copy of this article
    by Sourour MAALEM, Nacereddine ZAROUR 
    Abstract: Requirements must be validated at an early stage of analysis. Requirements validation usually relies either on natural language, which is often inaccurate and error-prone, or on formal models, which are difficult for non-technical stakeholders to understand and use. The majority of existing validation approaches are applied in a heterogeneous process, using a variety of relatively independent techniques, without any methodological or cognitive framework describing the mechanisms of human or artificial thought involved. In this work we present a cognitive approach to collaborative requirements validation based on the theory of action, structured as a set of steps intended to increase the client's involvement at this stage of the engineering cycle and to bring the customer's mental model closer to the analyst's. In the proposed process, the analyst starts by extracting needs one by one from the requirements documents. Each need goes through a step of intention formulation, which transforms needs into requirements. This transformation follows a new syntax and generates a checklist of quality attributes from the viewpoint of each stakeholder. A subsequent step specifies, on the basis of these intentions, the actions that the analyst perceives and verifies in order to build a rapid, machine-executable prototype of the software interface. The customer perceives the prototype, interprets it and validates the needs. A database of valid needs is created; needs that remain invalid must be negotiated, and if conflicts persist, a separate error base is created.
At the end of this collaborative validation process, decisions are made about who must participate in needs validation meetings, based on a Mental Effort metric that classifies participants according to the mental difficulty of executing the prototype and its articulatory and semantic problems, and that measures the commitment and motivation of stakeholders.
    Keywords: Requirements engineering; Requirements validation; Action Theory; Prototype; Cognitive Approach.

  • Approach to Terrain Pretreatment for the Yazidang Reservoir Based on Image Processing   Order a copy of this article
    by Lingxiao Huang 
    Abstract: A terrain pretreatment method based on image processing for the Yazidang Reservoir is presented. The image processing consists of image stitching and image edge detection. The SURF algorithm is used to extract feature points from partial images of the reservoir; the feature points are matched and the partial images are stitched into a wide-view reservoir image using a KD-tree algorithm and a linear gradient fusion algorithm. Improved mathematical morphology with structural elements of five different shapes and four different scales is then applied to obtain a precise waterfront edge of the reservoir image. A *.dxf file depicting the reservoir shore contours is produced with CAD software and converted to *.kml to extract the latitude and longitude of the waterfront edge. A satisfactory mesh is generated using the Delaunay triangulation algorithm. The results show that accurate and detailed reservoir terrain can be obtained using this image processing method together with the GE software, which provides a prerequisite for the numerical simulation of sedimentation in the Yazidang Reservoir.
    Keywords: terrain pretreatment; image processing; SURF algorithm; KD-tree algorithm; linear gradient fusion algorithm; mathematical morphology; Delaunay triangulation.

  • A novel imbalanced data classification algorithm based on fuzzy rule   Order a copy of this article
    by Zhiying Xu, YiJiang Zhang 
    Abstract: The classification of imbalanced data can increase the comprehensibility and expansibility of data and improve the efficiency of data classification, yet the accuracy of current methods for analysing imbalanced big data is poor. To this end, this paper presents an imbalanced data classification algorithm based on fuzzy rules. The algorithm first collects the imbalanced data, selects its features, and then optimises the classification by means of a fuzzy-rule classification algorithm. The experimental results show that, when the classifier maintains a weak classifier of a certain size, the classification accuracy of the proposed algorithm gradually improves as the training time increases and then stabilises within a certain range, demonstrating that the method can improve the accuracy of imbalanced data classification.
    Keywords: Imbalanced data; Data feature selection; Data classification.

  • A Personalized Recommendation Algorithm Based on Probabilistic Neural Networks   Order a copy of this article
    by Long Pan, Jiwei Qin, Liejun Wang 
    Abstract: Collaborative filtering is widely used in recommendation systems. Our work is motivated by the observation that users are embedded in an attention relationship network, and that their opinions about items are directly or indirectly affected by others through this network. Based on the behaviour of users with similar interests, the technique uses their opinions to recommend items; the quality of the similarity measure between users or items therefore has a great impact on recommendation accuracy. This paper proposes a new recommendation algorithm with a graph-based model. The similarity between two users (or two items) is computed from the connections of a graph whose nodes are users and items, and the computed similarity measure is fed to probabilistic neural networks to generate predictions. The model is evaluated on a recommendation task that suggests which videos users should watch based on what they watched in the past. Experimental results on the YouKu and Epinions datasets demonstrate the effectiveness of the approach in comparison with both collaborative filtering using traditional similarity measures and simple graph-based methods, improving overall recommendation performance in precision, recall and coverage, and thereby user satisfaction.
    Keywords: recommendation system; similarities; graphs-based approach; collaborative filtering; probabilistic neural networks.

  • Temporal Impact Analysis and Adaptation for Service-Based Systems   Order a copy of this article
    by Sridevi Saralaya, Rio D'Souza, Vishwas Saralaya 
    Abstract: Temporality is an influential aspect of service-based systems (SBS). The inability of a service to meet its time requirements may lead to violation of service-level agreements (SLAs) in an SBS. Such non-conformity by a service may introduce temporal inconsistency between dependent services and the composition. The temporal impact of the anomaly on related services and on the composition must be identified if SLA violations are to be rectified. Existing studies concentrate on impact analysis due to web service evolution or changes to a web service; there is a large lacuna regarding the impact of time delays on the temporal constraints of dependent services and the obligations of the business process. Although reconfiguration of an SBS to overcome failures is extensively addressed, reconfiguration triggered by temporal delay is not well explored. In this study we try to fill the gap between reconfiguration and impact analysis invoked by temporal violations. Once the impacted region and the amount of temporal deviation of the business process are known, we attempt recovery by localising the reconfiguration of services to the impacted zone.
    Keywords: Service-Based Systems; Impact-Analysis; Proactive Adaptation; Reactive Adaptation; Reconfiguration; Cross-Layer Adaptation; SLA violation handling; Anomaly handling; Service-Based Applications.

  • Considering the environment's characteristics in wireless networks simulations: case of the simulator NS2 and the WSN   Order a copy of this article
    by Abdelhak EL MOUAFFAK, Abdelbaki EL BELRHITI EL ALAOUI 
    Abstract: Recently, wireless networks, particularly wireless sensor networks (WSNs), have come to occupy an important place in several application areas thanks to progress in microelectronics and wireless communications. A body of research has addressed these networks in order to broaden the possibilities they offer and circumvent the problems encountered. Testing any new solution is an essential phase in validating its performance, and is done in network simulators, of which NS is the most used. The impact of the physical layer and of the radio signal propagation environment on simulation results is indisputable. In this context, after presenting and classifying radio propagation models, we study in detail the models implemented in NS-2. The focus is on the ability of these models to account for the characteristics of the deployment environment of wireless networks (e.g. the nature, position and mobility of obstacles). To address the specificities of WSNs, the effect of other parameters (e.g. antenna height) is also discussed.
    Keywords: Wireless network; Wireless sensor network; Simulation; Network Simulator; NS-2; Radio propagation model; Deployment environment.

  • Improved Biometric Identification System Using a New Scheme of 3D Local Binary Pattern   Order a copy of this article
    by KORICHI Maarouf, Meraoumia Abdallah, Aiadi Kamal Eddine 
    Abstract: In any computer vision application, the integration of a relevant feature extraction module is vital to making accurate classification decisions. In the literature, several methods that have achieved promising results and high accuracies are based on texture analysis. Various feature extraction techniques exist to describe texture information; among them, the Local Binary Pattern (LBP) is widely used to characterise an image sufficiently. Generally, the LBP descriptor and its variants are applied to grayscale images. In this paper, we propose a new method, a new scheme of 3D Local Binary Pattern, that can be applied to any type of image, whether grayscale, colour, multispectral or hyperspectral. We have developed a biometric system for person identification and an edge detection technique to evaluate it. The results obtained show that it achieves higher performance, in terms of identification rates, than other methods reported in the literature.
    Keywords: Feature extraction; Local Binary Pattern (LBP); Biometrics; Person identification; Palmprint; Data fusion.
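As background for the proposed 3D extension, the classic 2-D grayscale LBP that it generalises can be computed as follows; this is a minimal sketch of the standard 8-neighbour operator, not the paper's 3D scheme.

```python
import numpy as np

def lbp_image(gray):
    """Sketch: classic 8-neighbour LBP codes for a 2-D grayscale array.
    Each interior pixel gets an 8-bit code; bit k is set when neighbour k
    is >= the centre pixel."""
    g = np.asarray(gray, dtype=np.int32)
    c = g[1:-1, 1:-1]                       # interior (centre) pixels
    # 8 neighbours, clockwise from top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit
    return code
```

Histograms of these codes over image blocks are the usual texture descriptor; the 3D variant extends the neighbourhood across the spectral/depth axis.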

  • An Agricultural Data Storage Mechanism Based on HBase   Order a copy of this article
    by Changyun Li, Qingqing Zhang, Pinjie He, Zhibing Wang, Li Chen 
    Abstract: With the development of agricultural spatio-temporal localisation, sensor networks and cloud computing, the amount of agricultural data is increasing rapidly and its structure is becoming more complicated and changeable. Currently, the widely used agricultural databases are relational; they handle large amounts of data with very limited throughput and are not suitable for organising and managing distributed data. HBase is a non-relational, distributed database built on the Hadoop platform. It is suitable for storing unstructured data and can handle large volumes of data with high scalability. To better store agricultural big data in HBase, we propose a dedicated agricultural data buffer structure, which stores data by category, together with a two-level indexing memory organisation strategy on HBase. The proposed method saves more than a quarter of the time required by traditional buffering methods. Experimental results confirm the higher efficiency of the agricultural data buffer structure and memory organisation strategy.
    Keywords: agricultural big data; data buffer structure; HBase; two-level indexing strategy.
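The buffer-then-flush organisation with a category-first index can be mocked up in a few lines. The sketch below stubs out the actual HBase put, and the class name, batch size and key layout are assumptions, not the paper's design.

```python
class TwoLevelIndex:
    """Sketch: level 1 keys on data category, level 2 on row key; writes
    are buffered per category and flushed in batches (the real HBase put
    is stubbed out as a dict write)."""
    def __init__(self, batch=3):
        self.batch = batch
        self.index = {}        # category -> {row_key: value}
        self.buffers = {}      # category -> pending [(row_key, value), ...]
        self.flushes = 0

    def put(self, category, row_key, value):
        self.buffers.setdefault(category, []).append((row_key, value))
        if len(self.buffers[category]) >= self.batch:
            self.flush(category)

    def flush(self, category):
        # in a real system this would be one batched HBase write
        for k, v in self.buffers.pop(category, []):
            self.index.setdefault(category, {})[k] = v
        self.flushes += 1

    def get(self, category, row_key):
        for k, v in self.buffers.get(category, []):   # check buffer first
            if k == row_key:
                return v
        return self.index.get(category, {}).get(row_key)
```

Batching writes per category is what amortises the per-put overhead and yields the reported time savings over unbuffered writes.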

  • A Real-time Multi-Agent System for Cryptographic Key Generation using Imaging-based Textural Features   Order a copy of this article
    by Jafar Abukhait, Ma'en Saleh 
    Abstract: Traditional network security protocols depend on exchanging security keys between network nodes, thus exposing the network to different classes of security threats. In this paper, a multi-agent system for cryptographic key generation is proposed for real-time networks. The proposed technique generates a 256-bit key for the Advanced Encryption Standard (AES) algorithm using the textural features of digital images. By implementing this key generation technique at both the sender and receiver nodes, the process of exchanging security keys through the network is eliminated, making communication between network nodes robust against different security threats. Simulation results over a real-time network show the efficiency of the proposed system in reducing the overhead of the security associations performed by the IPsec protocol. The proposed agent-based system is also highly effective in guaranteeing the quality of service (QoS) of real-time requests, in terms of miss ratio and total average delay, by applying the best scheduling algorithm.
    Keywords: Cryptography; Textural Features; Gray Level Co-occurrence Matrix (GLCM); Advanced Encryption Standard (AES); QoS; Security Key; Network Node.
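One way such a scheme can work, sketched under stated assumptions (the paper's exact feature set, GLCM offsets and quantisation are not specified here), is to hash GLCM-derived textural features into a 256-bit digest that both endpoints can derive from a shared image.

```python
import hashlib
import numpy as np

def glcm(gray, levels=8):
    """Horizontal-offset grey level co-occurrence matrix (sketch)."""
    q = (np.asarray(gray) * levels // 256).clip(0, levels - 1)
    m = np.zeros((levels, levels), dtype=np.int64)
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        m[a, b] += 1
    return m

def image_key_256(gray):
    """Sketch: derive a 256-bit key by hashing GLCM textural features
    (contrast, energy, homogeneity). Both endpoints holding the same image
    derive the same key without any key exchange."""
    p = glcm(gray).astype(np.float64)
    p /= p.sum()
    i, j = np.indices(p.shape)
    contrast = float((p * (i - j) ** 2).sum())
    energy = float((p ** 2).sum())
    homogeneity = float((p / (1.0 + np.abs(i - j))).sum())
    feats = "%.6f|%.6f|%.6f" % (contrast, energy, homogeneity)
    return hashlib.sha256(feats.encode()).digest()   # 32 bytes = 256 bits
```

The key point is determinism: the features are quantised before hashing so that both nodes, computing independently, obtain the identical AES key.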

  • Modeling of learner behavior in massive open online video-on-demand services   Order a copy of this article
    by Ji-Wei Qin, Xiao Liu 
    Abstract: Video-on-demand (VoD), a popular Internet application, provides lively learning resources: in massive open online education, learners can freely select and watch the videos that interest them. Learner VoD behaviour is feedback that reveals preferences among learners, and it can help video providers design, deploy and manage learning videos in massive open online VoD services. In this paper, we collected learner VoD behaviour reports over 875 days and, on the basis of this real-world data, present a model of learner VoD behaviour in massive open online VoD services. Three main findings are reported: 1) educational video popularity matches the stretched exponential model better than the Zipf model; 2) long-session educational videos attract less popularity; 3) the Poisson distribution is the best fit for learner arrivals in massive open online VoD services. The popularity distribution of educational videos can help determine the number of copies of each video file to deploy on VoD servers, while the session and arrival patterns can inform the design of educational video content in massive open online VoD services.
    Keywords: learner behavior; video-on-demand; massive open online.
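The stretched exponential fit in finding 1) can be reproduced with a standard least-squares routine. The parameterisation views(r) = a·exp(-b·r^c) below is one common form of the stretched exponential, not necessarily the one used in the paper, and the synthetic data in the test is illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def stretched_exp(rank, a, b, c):
    """views(rank) = a * exp(-b * rank**c), one common SE parameterisation."""
    return a * np.exp(-b * rank ** c)

def fit_popularity(views):
    """Sketch: fit rank-ordered view counts to the stretched exponential
    model and return the fitted parameters (a, b, c)."""
    ranks = np.arange(1, len(views) + 1, dtype=np.float64)
    popt, _ = curve_fit(stretched_exp, ranks, views,
                        p0=(views[0], 0.1, 0.5), maxfev=10000)
    return popt
```

Comparing the residuals of this fit against a fitted Zipf law (views proportional to rank^-s) is the standard way to decide between the two models.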

  • Research on equalization scheduling algorithm for network channel under the impact of big data   Order a copy of this article
    by Zheng Yu, Dangdang Dai, Zhiyong Zha, Yixi Wang, Hao Feng 
    Abstract: In order to improve the equalisation scheduling ability of network channels, this paper proposes an equalisation scheduling algorithm for big data network communication based on baud-spaced equalisation and decision feedback modulation. A model of the network communication channel under the impact of big data is constructed to analyse the multipath characteristics of the channel; a coherent multipath channel modulation method is used to filter intersymbol interference, and adaptive baud-spaced equalisation is used to design the channel equaliser. A tapped-delay-line channel model is used to suppress multipath in the network channel, and decision feedback modulation is applied to equalisation scheduling to overcome the phase shift caused by the impact of big data on the channel and to improve channel equalisation. Simulation results show that when the proposed algorithm is used for network channel equalisation scheduling, the fidelity of the symbols output by network communication is good, the bit error rate is low, and the equalisation scheduling performance under the impact of big data and multipath is good, improving the robustness of the network channel.
    Keywords: big data; multipath effect; network channel; equalization scheduling; baud space; modulation.

  • Study on high accurate localization using multipath effects via physical layer information   Order a copy of this article
    by Yajun Zhang, Yawei Chen, Hongjun Wang, Meng Liu, Liang Ma 
    Abstract: Device-free passive localization (DFL) has shown great potential for localizing targets that do not carry any device in an area of interest (AoI). It is especially useful for many applications, such as hostage rescue, wildlife monitoring, elder care and intrusion detection. Current RSS (received signal strength) based DFL approaches, however, work only under the prerequisite that the collected signal mainly travels along a direct line-of-sight (LOS) path, and they do not perform well in typical indoor buildings with multipath effects. This paper presents a fine-grained CSI-based localization system that is effective in multipath and non-line-of-sight scenarios. The intuition underlying our design is that CSI (channel state information) benefits from the multipath effect, because the received signal measurements at different sampling positions are combinations of different CSI measurements. We adapt an improved maximum likelihood method to pinpoint a single target's location. Finally, we build a prototype of our design using commercial IEEE 802.11n NICs. Results from experiments in a lobby and a laboratory of our university demonstrate that, compared to RSS-based and other CSI-based schemes, our design locates a single target best, with an accuracy of 0.95 m.
    Keywords: device-free passive localization; channel state information; improved maximum likelihood method.

  • A fast particle filter pedestrian tracking method based on color, texture and corresponding space information   Order a copy of this article
    by Yang Zhang, Dongrong Xin 
    Abstract: A fast particle filter pedestrian tracking method based on colour, texture and corresponding spatial information is proposed. The algorithm first extracts the spatial information of the target pedestrian and divides it into three local regions. It then employs an improved texture and colour extraction algorithm to obtain joint texture and colour information from each sub-region. Finally, it determines the position of the target through a colour-texture similarity indicator based on the spatial division, yielding accurate tracking results. Because the multi-cue information fusion algorithm needs a large number of particles, which reduces computational efficiency, a wave integral histogram algorithm is proposed to improve the processing speed. Experiments on videos indicate the effectiveness and efficiency of the proposed method, which achieves higher accuracy than two other state-of-the-art algorithms in actual traffic scenes, while real-time performance is also improved considerably.
    Keywords: pedestrian tracking; particle filter; integral histogram; texture information.
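The integral histogram that provides the speed-up works like an integral image computed per histogram bin, so any rectangle's histogram costs four lookups per bin instead of a full scan over its pixels. A minimal sketch (function names are illustrative):

```python
import numpy as np

def integral_histogram(labels, n_bins):
    """Sketch: one integral image per bin over a quantised label map."""
    h, w = labels.shape
    ih = np.zeros((n_bins, h + 1, w + 1), dtype=np.int64)
    for b in range(n_bins):
        ih[b, 1:, 1:] = np.cumsum(np.cumsum(labels == b, axis=0), axis=1)
    return ih

def region_histogram(ih, top, left, bottom, right):
    """Histogram of labels[top:bottom, left:right] in O(n_bins) via the
    usual four-corner inclusion-exclusion."""
    return (ih[:, bottom, right] - ih[:, top, right]
            - ih[:, bottom, left] + ih[:, top, left])
```

With many particles each scoring a candidate rectangle every frame, this turns the per-particle histogram cost from O(region area) into O(bins), which is where the claimed speed-up comes from.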

  • Velocity Monitoring Signal Processing Method of Track Traffic Based on Doppler Effect   Order a copy of this article
    by Xiaojuan Hu, Tie Chen, Nan Zhao 
    Abstract: In order to improve the efficiency of track traffic speed monitoring, a signal processing method based on the Doppler effect is proposed to meet the accurate velocity measurement requirements of high-speed trains, and the corresponding track traffic radar monitoring signal processing method is studied. First, the Doppler effect and Doppler principles are analysed. The signal processing algorithm of the rail traffic radar system is then examined, and an improved algorithm based on the imaginary-part arithmetic of the FFT (fast Fourier transform) of a real sequence is put forward, simplifying the FFT computation. Finally, the spectrum analysis effect of the improved FFT algorithm is verified through Matlab simulation. The test results show that the improved FFT algorithm preserves measurement accuracy while speeding up the Doppler frequency calculation, and that it meets the processing requirements of the rail traffic radar velocity measurement system for track traffic velocity monitoring signals.
    Keywords: Signal processing; Doppler effect; Track traffic; FFT algorithm.
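The Doppler-to-velocity step can be sketched with a plain complex FFT; the paper's real-sequence imaginary-part optimisation is not reproduced here, and the carrier frequency, sampling rate and function name are illustrative assumptions.

```python
import numpy as np

def doppler_velocity(samples, fs, f_carrier, c=3e8):
    """Sketch: estimate radial speed from the dominant Doppler line of a
    baseband radar return. A Hanning window limits spectral leakage; the
    strongest FFT bin gives the Doppler shift f_d."""
    n = len(samples)
    spec = np.fft.fft(samples * np.hanning(n))
    freqs = np.fft.fftfreq(n, d=1.0 / fs)
    f_d = abs(freqs[np.argmax(np.abs(spec))])
    return f_d * c / (2.0 * f_carrier)   # v = f_d * lambda / 2
```

The conversion uses the standard radar relation f_d = 2·v·f0/c, so v = f_d·c/(2·f0); resolution in v is set by the FFT bin width fs/n.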

  • Research and Application on Logistics Distribution Optimization Problem using Big Data Analysis   Order a copy of this article
    by Yuming Duan, Hai-tao FU 
    Abstract: The optimisation of logistics distribution centre location is discussed in the big data environment. The features of logistics distribution and the algorithm design are presented on the basic MapReduce platform, integrated with a data mining clustering algorithm. A clustering analysis algorithm based on geodesic distance is proposed, combined with the features of logistics, together with a parallel algorithm design and an improved scheme based on MapReduce. In real situations there is no straight-line path between nodes, whereas the Dijkstra distance can measure the actual distance between two points, so a GDK-means clustering algorithm is put forward. The improved clustering algorithm is parallelised to process the large amounts of unstructured data found in logistics big data; in the parallel design, the algorithm is further improved with regard to complexity and time efficiency. The clustering algorithm can be applied to the logistics distribution centre location problem, and it is verified to provide a decision scheme for route optimisation in the logistics distribution chain according to the granularity of the spatial division.
    Keywords: logistics distribution; center location; k-means; geodesic distance; big data.
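The core GDK-means idea, k-means with Dijkstra (geodesic) distances and medoid centres (an arbitrary mean point need not lie on the road network), can be sketched without the MapReduce layer as follows; the seeding strategy and data structures are assumptions.

```python
import heapq

def dijkstra_all(graph):
    """All-pairs shortest (geodesic) path lengths over a weighted road
    graph; `graph` maps node -> {neighbour: edge_length}."""
    dist = {}
    for src in graph:
        d = {src: 0.0}
        heap = [(0.0, src)]
        while heap:
            du, u = heapq.heappop(heap)
            if du > d.get(u, float('inf')):
                continue
            for v, w in graph[u].items():
                nd = du + w
                if nd < d.get(v, float('inf')):
                    d[v] = nd
                    heapq.heappush(heap, (nd, v))
        dist[src] = d
    return dist

def gdk_means(graph, k, iters=20):
    """Sketch: k-means-style clustering with geodesic distance, updating
    each cluster centre to the member node minimising total geodesic
    distance to its cluster (a medoid)."""
    d = dijkstra_all(graph)
    nodes = sorted(graph)
    centres = nodes[:k]                      # simple deterministic seeding
    for _ in range(iters):
        clusters = {c: [] for c in centres}
        for n in nodes:
            clusters[min(centres, key=lambda c: d[c][n])].append(n)
        new = sorted(min(m, key=lambda cand: sum(d[cand][x] for x in m))
                     for m in clusters.values() if m)
        if new == centres:
            break
        centres = new
    assign = {n: min(centres, key=lambda c: d[c][n]) for n in nodes}
    return centres, assign
```

The all-pairs Dijkstra step dominates the cost, which is exactly the part the paper pushes into MapReduce for large road networks.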

Special Issue on: Collaboration Technologies and Systems for Disaster Management

  • RTCP: a redundancy aware topology control protocol for wireless sensor networks
    by Bahia Zebbane, Manel Chenait, Chafika Benzaid, Nadjib Badache 
    Abstract: Topology control-based sleep-scheduling aims at exploiting node redundancy to save energy and extend the network lifetime, by putting as many nodes as possible in sleep mode, while maintaining a connected network. In this paper, we propose a redundancy aware topology control protocol (RTCP) for a wireless sensor network which exploits the sensor redundancy in the same region. This is achieved by dividing the network into groups so that a connected backbone can be maintained by keeping a necessary set of working nodes and turning off the redundant ones. RTCP allows applications to parameterise the desired connectivity degree. It identifies node redundancy, in terms of communication; it groups redundant nodes together according to their redundancy degrees and threshold of connectivity level. Finally, it schedules nodes in groups for active or sleep mode. The simulation results illustrate that RTCP outperforms some other existing algorithms, in terms of energy conservation, network lifetime and connectivity guarantee.
    Keywords: wireless sensor networks; WSNs; topology control; energy conservation; duty-cycling; connectivity; node redundancy; leader election.
    DOI: 10.1504/IJICT.2018.10004619
  • Coordinated route reconfiguration for throughput optimisation under Rician fading channel
    by Adnan Fida, Nor Tuah Jaidi, Trung Dung Ngo 
    Abstract: This paper focuses on high-throughput data transmission to deliver ample sensor data, such as images or videos, in a post-disaster scenario using mobile wireless sensor networks. We formulate the route optimisation problem in the presence of a Rician fading channel. The coordinated route reconfiguration strategy integrates communication quality, a shortest path algorithm, particle swarm optimisation, and the mobility of multiple routers to optimise the end-to-end throughput of data transmission routes. We show how a coordinator can be used to identify routers with critical links and to gradually manoeuvre them towards a reconfigured route with higher end-to-end throughput. The evaluation of our solution shows that the proposed strategy considers the inherent characteristics of the Rician fading channel, and takes advantage of router mobility to provide routes with better performance compared with those generated by a non-coordinated route reconfiguration framework.
    Keywords: throughput optimisation; Rician fading channel; communication aware; particle swarm optimisation; PSO; mobility.
    DOI: 10.1504/IJICT.2018.10004623
  • Coverage enhancement with occlusion avoidance in networked rotational video sensors for post-disaster management
    by Nawel Bendimerad, Bouabdellah Kechar 
    Abstract: Wireless video sensor networks (WVSNs) have recently emerged as a new class of sensor networks in which large amounts of visual data are sensed. This kind of network, strengthened with rotation capability for each video sensor, is envisioned to be deployed to monitor a plethora of real-world phenomena. In this work, we propose a model of WVSN for post-disaster management in order to assist search and rescue operations by locating survivors and identifying risky areas. Rotatable video sensors are randomly deployed and used to switch to the best direction with the purpose of achieving high coverage while avoiding obstacles. To address the fault tolerance problem in WVSNs, we build potential cover sets to ensure coverage of the field of view of failed nodes. We have conducted a set of comprehensive experiments using the OMNeT++ simulator and the obtained results reveal that our proposal gives better performance in terms of coverage enhancement and fault tolerance.
    Keywords: wireless video sensor network; WVSN; coverage; scheduling; fault tolerance; occlusion; obstacles avoidance; OMNeT++.
    DOI: 10.1504/IJICT.2018.10004617
  • Architecture for gathering and integrating collaborative information for decision support in emergency situations
    by Tiago Marino, Maria Luiza Machado De Campos, Marcos R.S. Borges, John G. Breslin, Maciej Dabrowski 
    Abstract: The involvement of citizens in supporting crisis situations is no longer a new phenomenon. In the past, the lack of data was one of the main barriers faced by public managers in the decision-making process. Today the situation has been reversed, such that the challenge faced is managing an excessive mass of data, which is totally dynamic and originates from different sources such as remote environmental sensors, social networks and response teams in the field. During an emergency response, the concern is no longer being able to collect data for a better understanding of the affected environment, but knowing how to organise, aggregate and separate what is actually useful for crisis managers. This research proposes a collaborative information architecture that considers aspects of environmental complexity in the context of emergency scenarios, in order to support response teams with decision making, through gathering and integrating information originating from different media resources and, hence, enriching a decision team's 'current contextual knowledge' base.
    Keywords: emergency; heterogeneous information sources; complex systems; information integration; crisis and disaster management; social media; collaboration; domain vocabulary; decision support; data integration.
    DOI: 10.1504/IJICT.2018.10004620
  • A framework combining agile, user-centred design and service-oriented architecture approaches for collaborative disaster management system design
    by Karima Ait Abdelouhab, Djilali Idoughi, Christophe Kolski 
    Abstract: Disaster management is a special type of complex human organisation in which heterogeneous human actors belonging to different authorities collaborate and work together with the shared aim of solving, or at least mitigating, the disaster situation. The collaboration, both within team members and with other teams operating at the disaster site(s), is therefore critical and complex; the achievement of the desired goal heavily depends on it. Interactive and easy-to-use services are very valuable and necessary in these scenarios, as they can improve collaboration, coordination and communication amongst teams. For this purpose, in this paper, we propose a novel design framework for complex disaster management systems. We combine agile characteristics and principles, user-centred techniques and the service-oriented architecture paradigm. Our aim is to take into account the needs of disaster managers in an iterative development process, to improve the human actors' involvement in design projects, and to accommodate changes in order to produce highly usable and interactive service-based collaborative systems.
    Keywords: user-centred design; UCD; agile method; service-oriented architecture; service design; disaster management; collaboration; information technology.
    DOI: 10.1504/IJICT.2018.10004621
  • Constructing collective competence: a new CSCW-based approach
    by Djalal Hedjazi 
    Abstract: In most contexts, it is individuals who are considered to be competent or incompetent. However, in many cases it is the performance of groups and teams that matters most. This implies a concept of collective competence that integrates the set of skills in a group. In addition, the collective competence construction process is enriched through collaboration, which implies exchanges, confrontations, negotiations and interpersonal interactions. This paper presents our CSCW-based approach supporting collective competence construction. As a case study, the industrial maintenance workspace is fundamentally a collaborative context. Our contribution in this area led us, first, to analyse the related tasks in order to highlight the vital needs of collaborative maintenance and to design the required group awareness supports, which will be used to support collective competence. Finally, an experimental study identifies the most effective group awareness tools.
    Keywords: computer-supported cooperative work; collective competence construction; groupware assessment; collaborative e-maintenance; group communication technology.
    DOI: 10.1504/IJICT.2018.10004618
  • A hybrid ad hoc networking protocol for disaster recovery resource management
    by Aris Ouksel, Doug Lundquist 
    Abstract: Following a major disaster, the infrastructure supporting wired and mobile networking is expected to be inoperable over large areas. Thus, emergency response teams must communicate using their own networking equipment. A large-scale peer-to-peer network offers fast and flexible deployment but requires cooperation among nodes. Usage constraints must be imposed to prevent overloading the shared network capacity. In particular, disparate communication models (for route building and optimisation, resource management, and localised status flooding) must be integrated. We propose a hybrid network protocol which dynamically assigns network capacity to these three communication models by imposing transmission delays in accordance with their attempted usage rates.
    Keywords: disaster recovery; communication systems; ad hoc networking; hybrid protocol design; contingency theory.
    DOI: 10.1504/IJICT.2018.10004622

Special Issue on: Recent Developments in Wireless and Optical Communication Networks

  • Medium access control in wireless sensor networks: a survey
    by Mohamed Hefeida, Ashfaq A. Khokhar 
    Abstract: Wireless sensor networks (WSNs) are being integrated across a wide spectrum of military, commercial, and environmental applications, such as field surveillance, environmental monitoring, and disaster management. This variety in WSN applications has led to the development of a large number of medium access control (MAC) protocols with different objectives. A common goal of these protocols, however, is preserving energy in order to maximise network lifetime. In an effort to facilitate the study of existing MAC protocols and the development of novel medium access techniques, this paper presents a classification and critical review of existing MAC protocols adopted in different WSN environments. We classify these protocols based on channel access into three main categories: contention-based, contention-free, and hybrid protocols. Cross-layer protocols (involving the MAC layer) are also studied. We describe the major characteristics of these classes, the differences among them, possible improvements, and outline ongoing and future research challenges.
    Keywords: medium access control; MAC; duty cycling; energy efficiency; disaster management; contention; wireless sensor networks; WSNs.
    DOI: 10.1504/IJICT.2018.10006490

Special Issue on: Information Technology for Organisation Development

  • A new multi-criteria decision process to prioritise requirements
    by Mohamed Amroune, Nacereddine Zarour, Pierre-Jean Charrel 
    Abstract: Most software projects have more candidate requirements than can be realised within the time and cost constraints. This paper presents a novel multi-criteria decision analysis process to prioritise requirements. The novelty of the presented idea is three-fold. Firstly, to prioritise requirements, it distinguishes two categories of requirements according to their level of abstraction: low-level requirements and high-level requirements. Then, it relates business goals to the low-level requirements which contribute to their fulfilment. This improves the completeness and traceability of requirements. Secondly, requirements prioritisation is based on their degree of contribution to the identified business goals and their importance, so the business goals become the evaluation criteria. Finally, this process takes into account relationships and dependencies that may exist between business goals. To this end, we employ the analytic hierarchy process (AHP) method to assign weights to business goals and the Choquet integral to calculate a global score, i.e. priority, for each requirement.
    Keywords: requirements prioritisation; decision analysis; fuzzy measure; Choquet integral; AHP.
    DOI: 10.1504/IJICT.2018.10010448
  • An empirical study of clone detection in MATLAB/Simulink models
    by Dhavleesh Rattan, Rajesh Bhatia, Maninder Singh 
    Abstract: Complex systems consisting of millions of components are very difficult to develop and manage. Thus, model-driven development has become an essential development paradigm. However, very large-scale models suffer from unexpected overlaps of parts. The overlapped and copied fragments in models are known as model clones; they increase maintenance cost and resource requirements. Recent research has shown the presence of clones in MATLAB/Simulink models. To gain a better understanding, we have conducted an in-depth empirical study of 18 MATLAB/Simulink models using ConQAT, an open-source clone detection framework. Our study shows that there is significant cloning in the models and identifies some interesting patterns of clones that are relevant to improving maintenance.
    Keywords: model clone detection; empirical study; software maintenance; MATLAB Simulink models; ConQAT.
    DOI: 10.1504/IJICT.2018.10010451
  • De-noising by Gammachirp and Wiener filter-based methods for speech enhancement
    by Hajer Rahali, Zied Hajaiej, Noureddine Ellouze 
    Abstract: In this paper, we propose a method for enhancing speech corrupted by noise. The new speech enhancement approach combines RASTA, the Wiener filter (WF) and the Gammachirp filter (GF) in series to construct a two-stage hybrid system (named RASTA-WF-GF) in the frequency domain to enhance speech with additive noise. It is shown that the proposed method significantly outperforms the spectral subtraction (SS), Wiener filter (WF), Kalman filter (KF) and RASTA speech enhancement methods in the presence of noise.
    Keywords: Gammachirp filter; Wiener filter; robust speech recognition; noise reduction.
    DOI: 10.1504/IJICT.2018.10010446
  • Remote sensing image retrieval using object-based, semantic classifier techniques
    by N. Suresh Kumar, M. Arun, Mukesh Kumar Dangi 
    Abstract: Data captured by satellites is increasing exponentially, serving agriculture and crop management, health, climate change studies and the prediction of plant growth. A reliable, automated satellite image classification and retrieval system is therefore required, as a massive amount of remotely sensed data is collected and transmitted by satellites every day. Many retrieval systems have been proposed for image content and information retrieval; however, their output is generally not up to expectation. In this work, a new remote sensing image retrieval scheme is proposed, combining content-based image retrieval with grid computing and advanced database concepts to speed up both input processing and system response time. This paper presents the idea of processing input data and queries in parallel, and of storing images in the database using advanced database structures such as B+ or BST trees.
    Keywords: two-dimensional multi-resolution hidden Markov model; 2-D MHMM; remote sensing images; semantic network.
    DOI: 10.1504/IJICT.2018.10010450
  • Intuitionistic fuzzy local binary pattern for features extraction
    by Mohd Dilshad Ansari, Satya Prakash Ghrera 
    Abstract: We propose a novel intuitionistic fuzzy feature extraction method to encode local texture. The proposed method extends the fuzzy local binary pattern approach by incorporating intuitionistic fuzzy set theory into the representation of local texture patterns in images. The intuitionistic fuzzy local binary pattern also contributes to more than one bin in the distribution of intuitionistic fuzzy local binary pattern values, which can be used as a feature vector. The proposed intuitionistic fuzzy local binary pattern approach was experimentally evaluated on the Lena image of size 256 × 256. The results validate the effectiveness of the proposed intuitionistic fuzzy local binary pattern over the local binary pattern and fuzzy local binary pattern feature extraction methods.
    Keywords: fuzzy local binary pattern; FLBP; intuitionistic fuzzy sets; IFSs; intuitionistic fuzzy local binary pattern; IFLBP; entropy.
    DOI: 10.1504/IJICT.2018.10005094
  • A lossless image encryption algorithm using matrix transformations and XOR operation
    by Assia Beloucif, Lemnouar Noui 
    Abstract: Encryption is the standard way to ensure the confidentiality of data. Digital images have special features, such as their large, bulky size and the strong correlation between pixels, which make traditional encryption algorithms unsuitable for image encryption. To address this, we propose a novel lossless encryption scheme for digital images based on a combination of matrix transformations and the XOR operation. The numerical experimental results confirm that the proposed method achieves a high security level against brute force attacks, statistical attacks and sensitivity analysis; moreover, the suggested algorithm provides good randomness properties, and thus our method can be applied to image encryption and transmission in sensitive domains.
    Keywords: lossless image encryption; confidentiality; matrix transformations; multimedia security.
    DOI: 10.1504/IJICT.2018.10010447
  • Swarm intelligence algorithms in cryptanalysis of simple Feistel ciphers
    by Tahar Mekhaznia, Abdelmadjid Zidani 
    Abstract: Recent cryptosystems constitute a hard task for cryptanalysis algorithms due to the nonlinearity of their structure; the underlying problem can be formulated as NP-hard. It has long been subject to various attacks, but the related results remain insufficient, especially when handling wide instances, due to resource requirements which increase with the size of the problem. On the other hand, heuristic methods are techniques able to investigate large spaces of candidate solutions. Swarm intelligence algorithms, a subset of heuristic methods, represent a set of approaches characterised by their fast convergence and easy implementation. The purpose of this paper is to provide a detailed study of the performance of two swarm intelligence algorithms, the BAT algorithm and the wolf pack search (WPS) algorithm, for the cryptanalysis of some variants of Feistel ciphers. Experiments were conducted in order to study the effectiveness of these algorithms in solving the considered problem. Moreover, a comparison with the VMMAS, PSO and DE algorithms establishes this advantage.
    Keywords: cryptanalysis; Feistel ciphers; BAT algorithm; WPS algorithm; VMMAS algorithm; PSO algorithm; DE algorithm.
    DOI: 10.1504/IJICT.2018.10010452
  • Modelling UML state machines with FoCaLiZe
    by Messaoud Abbas, Choukri-Bey Ben-Yelles, Renaud Rioboo 
    Abstract: UML and OCL are widely adopted as a standard to describe the static and dynamic aspects of systems and to specify their properties. Model driven engineering (MDE) techniques can be used to automatically generate code from such models. For critical systems, formal methods are frequently used together with UML/OCL models in order to analyse and check model properties. In this paper, we propose a combination of UML/OCL and FoCaLiZe, an object-oriented development environment using a proof-based formal approach. More specifically, we propose a formal transformation of UML state machines conditioned with OCL constraints into FoCaLiZe specifications that can be refined to generate executable code. The proposed transformation supports communication between a class structure and its state machine. Thanks to Zenon, the automatic theorem prover of FoCaLiZe, errors in the original UML/OCL model, if any, are automatically detected. We illustrate our translation with a concrete case study.
    Keywords: FoCaLiZe; UML; Object Constraint Language; OCL; state machine; model driven engineering; MDE; proof; Coq; Zenon.
    DOI: 10.1504/IJICT.2018.10010449