International Journal of Electronic Security and Digital Forensics (29 papers in press)
Implementation of RFID Mutual Authentication Protocol
by Sivasankaran Kumaravel, Ashik Joji
Abstract: RFID (radio frequency identification), the most flexible automatic identification technology, has a known weakness in its security. Over the years, researchers have worked on the security of the long-established and widely used passive UHF RFID tags and have proposed several authentication protocols, while largely neglecting their hardware implementation. Here, a lightweight mutual authentication protocol is implemented in an ASIC based on the EPC Class 1 Generation 2 framework released by EPCglobal, the widely used industrial standard for passive UHF RFID communication. We propose incorporating a ROM to store the message signal, which yields a significant reduction in area and power compared with the existing digital baseband architecture.
Keywords: EPC; security; RFID; LFSR; lightweight; authentication; VLSI; PIE decoder; FM0 encoder.
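The tag-to-reader link in EPC Gen2 uses FM0 baseband encoding, one of the topics this paper's keywords name. As a rough sketch of an FM0 encoder (the starting line level is our assumption; the waveform is represented as two half-symbol levels per bit):

```python
def fm0_encode(bits, start_level=1):
    """FM0 baseband encoding as used on the EPC Gen2 tag-to-reader link.

    Every symbol boundary toggles the line level; a data-0 also toggles
    mid-symbol, while a data-1 holds its level for the whole symbol.
    Returns a list of half-symbol levels (two entries per bit).
    """
    level = start_level
    out = []
    for b in bits:
        level ^= 1            # transition at every symbol boundary
        first = level
        if b == 0:
            level ^= 1        # extra mid-symbol transition for data-0
        out.extend([first, level])
    return out
```

A matching decoder simply checks each symbol for a mid-symbol transition, which is how the baseband architecture discussed in the paper recovers tag data.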
Attribute-Based Encryption Supporting Data Filtration over Post-Quantum Assumptions
by Jiao Chunhong
Abstract: As the internet becomes prevalent, large amounts of sensitive data are transferred over open networks, so achieving efficient data transfer in a privacy-preserving manner is a genuine concern. Although attribute-based encryption (ABE) achieves fine-grained access control over encrypted data, it still cannot restrict access by unauthorized users. In this paper, we introduce a new cryptographic primitive called attribute-based encryption supporting data filtration (ABE-SDF) and formalize its security model, incorporating the advantages of previous ABE schemes. Finally, we present an efficient construction of the scheme over post-quantum assumptions; the scheme is believed to be quantum-resistant owing to the special properties of lattices. Based on the LWE assumption, we prove that the proposed scheme achieves indistinguishability against selective chosen-plaintext attacks as well as authentication information security.
Keywords: ABE; Data Transfer; LWE; Post-Quantum.
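For readers unfamiliar with the learning-with-errors (LWE) assumption underlying this construction, a toy sample generator illustrates its shape. The parameters below are far too small for security and are purely illustrative, not the paper's:

```python
import random

def lwe_sample(s, q=97, noise=1):
    """Generate one LWE sample (a, b) under secret vector s:
    b = <a, s> + e (mod q), with a uniform and e a small error.
    Distinguishing such samples from uniform pairs is believed hard
    even for quantum adversaries, which is what 'post-quantum' relies on."""
    n = len(s)
    a = [random.randrange(q) for _ in range(n)]
    e = random.randint(-noise, noise)
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    return a, b, e
```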
Video Steganalysis to Obstruct Criminal Activities for Digital Forensics: A Survey
by Mukesh Dalal, Mamta Juneja
Abstract: Steganography is the method of hiding information in a carrier, whereas steganalysis is the procedure of discerning the presence of information hidden in a carrier. Steganography is used for secure communication, but the same techniques can also be used by terrorists or criminals to camouflage their communications, so steganalysis techniques are becoming increasingly significant. In particular, techniques that can precisely detect the existence of secret data in a video are growing in importance, as there is evidence that terrorist groups are using video steganography to communicate. For national security, it is therefore necessary to gather adequate evidence of embedded secret data and to interrupt the communication. This paper presents some of the evidence of the use of steganography by terrorists and criminals, surveys existing video steganalysis techniques, and discusses some of the open challenges in this field.
Keywords: Steganography; video steganalysis; spatial domain; transform domain; compression; motion vector; motion estimation; inter-frame prediction; intra-frame prediction; classifier.
A Comparative Forensic Analysis of Privacy Enhanced Web Browsers and Private Browsing Modes of Common Web Browsers
by Ryan Gabet, Kathryn Seigfried-Spellar, Marcus Rogers
Abstract: Growing concerns regarding internet privacy have led to the development of enhanced privacy web browsers. The authors conducted a digital forensic examination of three enhanced privacy web browsers (Dooble, Comodo Dragon, Epic) and three commonly used web browsers in anonymous browsing mode (Chrome, Edge, and Firefox) to determine the recoverable artifacts. In addition, the authors compared two digital forensic tools (FTK, Autopsy) commonly used by law enforcement to determine differences in recoverable browser artifacts. Results indicated that the enhanced privacy browsers performed about the same as the common browsers in anonymous browsing mode, and that FTK was the better tool for recovering and viewing browser artifacts for both browser groups. Overall, this study did not produce sufficient evidence to conclude that enhanced privacy browsers do indeed provide better privacy.
Keywords: Privacy Browsers; Internet Artifacts; Digital Forensics; FTK; Anonymous.
Optimized Elliptic Curve Digital Signature on NIST Compliant Curves for Authentication of MANET Nodes
by Raj Kamal Kapur, Sunil Kumar Khatri, Lalit Mohan Patnaik
Abstract: Secure routing protocols for mobile ad hoc networks (MANETs) use digital signatures based on Rivest, Shamir and Adleman (RSA) for authentication of routing messages, which increases computational and communication overheads. The Elliptic Curve Digital Signature Algorithm (ECDSA), on the other hand, uses much shorter keys to provide the same level of security as RSA. This results in smaller signatures and lower computational, memory and power requirements, which are crucial to MANET nodes. ECDSA, however, has the characteristic that signature generation is very fast compared with RSA, but signature verification takes much longer due to complex arithmetic operations in the underlying finite prime field. Optimizations of point operations and scalar multiplication are proposed to accelerate the key generation, signature generation and verification processes. Acceleration of the verification process is also proposed by carrying out simultaneous multiplication of two points using the Joint Sparse Form (JSF) of the scalars, and this is compared with verification of ECDSA signatures using the sequential mixed Jacobian-affine wNAF scalar multiplication method. The proposed algorithm has been implemented in software, written in Java using the BigInteger class on a Linux platform, for National Institute of Standards and Technology (NIST) compliant curves. The proposed composite method accelerates ECDSA signature verification by approximately 27% over the sequential mixed Jacobian-affine wNAF method.
Keywords: ECDSA; Elliptic Curve; MANET; Digital Signature; Node Authentication; Secure Routing Protocol.
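The simultaneous two-point multiplication at the heart of the speed-up can be sketched with Shamir's trick, of which the JSF method is a refinement. The group operations are passed in as functions; integer addition modulo a prime stands in for elliptic-curve point arithmetic here, purely for illustration:

```python
def simultaneous_double_and_add(k1, P, k2, Q, add, dbl, zero):
    """Shamir's trick: compute k1*P + k2*Q with ONE shared run of
    doublings, scanning both scalars' bits together instead of doing
    two separate double-and-add passes. JSF recodes the scalars to
    further reduce the number of additions."""
    PQ = add(P, Q)  # precompute P + Q once
    R = zero
    for i in range(max(k1.bit_length(), k2.bit_length()) - 1, -1, -1):
        R = dbl(R)  # a single doubling serves both scalars
        b1, b2 = (k1 >> i) & 1, (k2 >> i) & 1
        if b1 and b2:
            R = add(R, PQ)
        elif b1:
            R = add(R, P)
        elif b2:
            R = add(R, Q)
    return R
```

With elliptic-curve point addition and doubling plugged in for `add` and `dbl`, this is the structure of the verification step `u1*G + u2*Q` in ECDSA.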
A New Diffusion and Substitution based Cryptosystem for Securing Medical Image Applications
by Mancy Lovidhas, Maria Celestin Vigila
Abstract: The rising demand for tele-health facilities has heightened interest in techniques for safeguarding medical images. These mainly concern patient records, which are confidential and must be available only to authorized persons, so medical image security becomes a very significant problem when patient information is conveyed over a public network. In this paper, a 128-bit secret key is generated from an image histogram. Initially, the photosensitive part of the Digital Imaging and Communications in Medicine (DICOM) image is decomposed by a mixing process. The resulting image is divided into key-dependent blocks, and these blocks are then passed through key-dependent diffusion and substitution processes. Five rounds in total are used in the encryption method. Finally, the generated secret key is embedded within the encrypted image by steganography, which further enhances the security of the proposed cipher. At the receiver side, the secret key is recovered from the embedded image and the decryption operations are performed in inverse order. Performance analysis indicates that the proposed cipher is more secure.
Keywords: Diffusion; Substitution; Histogram; Encryption; Steganography.
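The diffusion stage described above makes each output byte depend on the key and on all preceding bytes, so a single plaintext change propagates through the rest of the block. A minimal invertible sketch (the toy mixing function is our own; the paper's actual diffusion operates on key-dependent blocks over five rounds):

```python
def diffuse(block, key_byte):
    """Forward diffusion pass: each output byte mixes the key byte with
    the running ciphertext, so a one-byte change in the input alters
    every subsequent output byte."""
    out = []
    prev = key_byte
    for b in block:
        prev = (b + prev) % 256 ^ key_byte  # toy mixing, illustration only
        out.append(prev)
    return out

def undiffuse(block, key_byte):
    """Exact inverse of diffuse(): peel off the key, subtract the
    previous ciphertext byte."""
    out = []
    prev = key_byte
    for c in block:
        out.append(((c ^ key_byte) - prev) % 256)
        prev = c
    return out
```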
Energy Deviation Measure: A Technique for digital Image Forensics
by Surbhi Gupta, Neeraj Mohan, Parvinder Singh Sandhu
Abstract: Digital image forgery and its forensics have emerged as a significant research domain. Digital forensics is required to examine questioned images and classify them as authentic or tampered. This paper addresses image tamper detection using a novel Energy Deviation Measure (EDM). The EDM is a measure of deviation in pixel intensity with respect to its immediate and distant neighbourhood. It is extracted by measuring the inter-pixel intensity difference across and inside the DCT block boundaries of a JPEG image. Features from the EDM have been used for classifying authentic and tampered images, with a Support Vector Machine as the classifier. The experimental results show that the proposed method performs better with fewer feature dimensions than other state-of-the-art methods: it gives improved accuracy and area under the curve when classifying images, and it is robust to noise and to the JPEG compression quality factor.
Keywords: Energy Deviation Measure; Image tampering; Copy Move forgery; Image splicing; Image forensics; Compression artifacts.
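The boundary/interior split behind the EDM can be illustrated on the horizontal direction. This sketch only shows the idea of comparing differences across 8x8 JPEG block boundaries with differences inside blocks; the exact EDM feature definition follows the paper:

```python
def energy_deviation(pixels, block=8):
    """Toy blockiness statistic: the mean absolute horizontal pixel
    difference ACROSS 8x8 block boundaries minus the mean difference
    INSIDE blocks. JPEG compression leaves stronger discontinuities at
    block boundaries, and tampering disturbs that pattern."""
    across, inside = [], []
    for row in pixels:
        for x in range(1, len(row)):
            d = abs(row[x] - row[x - 1])
            (across if x % block == 0 else inside).append(d)
    mean = lambda v: sum(v) / len(v) if v else 0.0
    return mean(across) - mean(inside)
```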
ArMTFr: A New Permutation-Based Image Encryption Scheme
by Hassan Elkamchouchi, Wessam Salama, Yasmine Abouelseoud
Abstract: In this paper, a new image encryption scheme named ArMTFr is proposed. An image is encrypted using a combination of keyed permutations and substitution, where a fractal is XORed with the scrambled image. Fractal images are employed to improve the performance of the encryption scheme from the viewpoint of randomization and to increase the encryption key space, thus boosting its security. The employed permutations are the Arnold map and the Mersenne Twister permutation algorithm. Before the encryption process starts, histogram equalization is used to enhance the contrast of the image by transforming its intensity values, so that the histogram of the output image approximately matches a uniform histogram. First, grayscale images are considered, and then the basic algorithm is extended to handle colored images. Three representations for colored images are considered: the RGB, YCbCr and HSI color spaces. The security of the algorithm is enhanced in this case by applying RGB color channel multiplexing. The experimental results show that the encrypted image has low correlation coefficients among adjacent pixels and a good histogram distribution, as well as resistance to various attacks.
Keywords: Correlation; Image Encryption; Histogram Equalization; Pixel Permutation; Arnold Map; Fractals.
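Of the two permutations named above, the Arnold (cat) map has a particularly compact form. A sketch for an N x N image using the standard map (x, y) -> (x + y, x + 2y) mod N; the keyed variants and the Mersenne Twister permutation used in the paper are more involved:

```python
def arnold_map(image, iterations=1):
    """Arnold cat map scrambling of an N x N image: pixel (x, y) moves
    to ((x + y) mod N, (x + 2y) mod N). The transform matrix has
    determinant 1, so the map is a bijection: it only rearranges pixels
    and is periodic, returning the original image after enough rounds."""
    n = len(image)
    for _ in range(iterations):
        out = [[0] * n for _ in range(n)]
        for y in range(n):
            for x in range(n):
                out[(x + 2 * y) % n][(x + y) % n] = image[y][x]
        image = out
    return image
```

Because the map only permutes pixels, the image histogram is unchanged; this is why such schemes pair a permutation stage with a substitution stage (here, the fractal XOR).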
A New Scheme of Preserving User Privacy for Location-Based Service
by Xiaojuan Chen, Huiwen Deng
Abstract: Individual privacy has been a great concern to users who need location-based services on networked devices such as smartphones and personal computers. Usually, the provider of a location-based service is regarded as semi-trusted or honest-but-curious, which can cause tremendous harm to users who request the service if a dishonest service provider leaks their personal information. To preserve user privacy, we propose a scheme that protects a user's private information, including location, identity, and domain, while the user can still obtain the required service from a service provider. For the sake of low computational time and minimal computing power, only symmetric-key cryptography is employed in our system. The scheme is shown to be secure by our security analysis and feasible through our prototype implementation. Compared with related schemes, our scheme provides sufficient properties to meet our requirements. To the best of our knowledge, this is the first privacy-preserving scheme covering all of a user's private information, rather than location privacy only as in the previous literature.
Keywords: Preserving Privacy; Security; Confidentiality.
Drone Forensics: Examination and Analysis
by Farkhund Iqbal, Benjamin Yankson, Babar Shah, Maryam Ahmed AlYammahi, Naeema Saeed AlMansoori, Suaad Mohammed Qayed, Thar Baker
Abstract: Unmanned aerial vehicles (UAVs), also known as drones, provide unique functionality for area surveillance, inspection, surveying, unarmed cargo delivery, armed attack, and aerial photography. Although drones have been around for some time, mass adoption of the technology is new; it is now widely used in fields including law enforcement, cartography, agriculture, disaster monitoring, and scientific research. Due to vulnerabilities and the lack of stringent security implementations, drones are susceptible to GPS spoofing attacks, integrity attacks and de-authentication attacks. These attacks can allow criminals to access data, intercept the drone, use it to commit a crime, and complicate forensic investigation. Standardized drone forensics is imperative in order to help identify vulnerabilities in different models of drones, solve drone-related crime, enhance security, and thwart anti-forensic measures by criminals. This paper therefore reports on potential attacks against the Parrot Bebop 2 drone and on an investigator's ability to collect evidence about those attacks. It examines the possibility of establishing ownership and collecting data to reconstruct events, linking the drone controller with the drone to prove ownership, flight origins and other potentially useful information necessary to identify the perpetrator of a crime. In addition, we propose a small-scale drone ontology for modeling drone context data and a simple forensic processing framework for small-scale drones.
Keywords: digital forensics; investigation; drone security; drone attack; context data; drone ontology.
Implementation of the PREDECI Model in the Prosecution of Chimborazo in Ecuador: A Case Study Evaluation
by Fernando T. Molina Granja, Glen D. Rodriguez Rafael, Raul Marcelo Lozada Yanez, Edmundo Bolivar Cabezas Heredia
Abstract: The model to be evaluated is a model for the preservation of digital evidence in criminal investigation institutions, where it is essential to preserve evidence together with the characteristics of its environment in order to increase the rate of admissibility of the evidence in court. This article evaluates the model and its impact in terms of security, admissibility, and long-term preservation. We respond to the following research question: does the model, implemented in a software application for a case study, raise the admissibility of digital evidence in court? To this end, a software application is developed, the unit of study is defined, and the results are analyzed. The study determined that the model, when implemented properly and following its implementation guidance, raises the admissibility of digital evidence in court.
Keywords: PREDECI; assessment models; admissibility; digital evidence; guide implementation.
Combating credit card fraud with online behavioral targeting and device fingerprinting
by Othusitse Seth Dylan Phefo
Abstract: Billions of dollars are lost to credit/debit card fraud every year, and the trend has continued upward despite the evolution of numerous fraud detection techniques applied across many business fields. Fraud detection involves, among other things, monitoring customers' credit card usage patterns to notice changes that might reflect fraud, and using that information to stop a transaction before any loss is realized or to inform the customer of suspicious activity in their account. Many fraud detection techniques are employed by card issuers and researchers, but they seem unable to stem the tide. Online advertising companies employ a number of groundbreaking technologies to send targeted advertising to internet users, among them online behavioral targeting (OBT) and device fingerprinting (DF). These technologies can track and profile internet users down to the level of what device they are using and what they are most likely to purchase. In this paper we propose a novel fraud detection framework that uses OBT data and DF to improve the efficiency of an existing fraud detection system (the fusion approach using Dempster-Shafer theory and Bayesian learning). OBT and DF provide massive insight into online behavior and can be used to pinpoint fraudsters as well as to learn the shopping patterns of credit card users.
Keywords: Fraud Detection; Security; Information Security; Dempster-Shafer Adder; Behavioral Targeting.
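The fusion approach this framework builds on merges independent pieces of evidence with Dempster's rule of combination. A compact sketch over frozenset focal elements; the fraud/legitimate frame and the mass values in the usage below are illustrative assumptions, not figures from the paper:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments
    whose focal elements are frozensets over a common frame. Mass on
    conflicting (disjoint) pairs is discarded and the rest renormalised."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict  # renormalisation constant
    return {s: v / k for s, v in combined.items()}
```

Combining, say, a usage-pattern source assigning mass 0.6 to "fraud" with a device-fingerprint source assigning 0.5 concentrates belief on "fraud", which is exactly how the fusion system accumulates evidence across sources.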
Information Security Model using Data Embedding Technique for Enhancing Perceptibility and Robustness
by Sunil Moon
Abstract: Concealing information using steganography is simple, but maintaining its security, perceptibility, robustness, embedding capacity, and good recovery of both the cover and the secret data are the major issues, and this paper focuses on improving all of them. The proposed technique embeds a secret image and audio as secret data into randomly selected frames of a video using the Multi-Frame Exploiting Modification Direction (MFEMD) algorithm, which makes it very difficult to determine in which part of the video the data is hidden. At the receiver end, a forensic tool is used for authentication to improve data security. Our simulation results are better than existing methods in terms of peak signal-to-noise ratio (PSNR), mean square error (MSE), correlation factor (CF), visual recovery of both the original video and the secret data, hiding capacity, and security of the secret data. Different types of attacks, such as visual, chi-square and histogram attacks, are applied to the stego video during transmission to assess the perceptibility and robustness of the secret data.
Keywords: MFEMD; Audio Video Crypto-Steganography; Information security; CF; Attacks.
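MFEMD builds on the exploiting-modification-direction (EMD) idea, in which a group of n pixels carries one base-(2n+1) digit while at most one pixel changes by +/-1. A sketch of basic EMD embedding and extraction (the multi-frame scheduling that gives MFEMD its name is not shown):

```python
def emd_extract(pixels):
    """EMD extraction function: f = sum(i * p_i) mod (2n + 1)."""
    n = len(pixels)
    return sum(i * p for i, p in enumerate(pixels, 1)) % (2 * n + 1)

def emd_embed(pixels, digit):
    """Embed one base-(2n+1) digit into n pixels by changing at most one
    pixel by +/-1: incrementing pixel j raises f by j, decrementing
    lowers it by j, so any target digit is one unit step away."""
    n = len(pixels)
    out = list(pixels)
    d = (digit - emd_extract(out)) % (2 * n + 1)
    if d != 0:
        if d <= n:
            out[d - 1] += 1       # +1 on pixel d raises f by d
        else:
            out[2 * n - d] -= 1   # -1 on pixel (2n+1-d) raises f by d mod 2n+1
    return out
```

Because at most one pixel per group moves by one grey level, the distortion (and hence PSNR impact) per embedded digit is minimal, which is the property the paper's perceptibility results rely on.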
A Novel Median Filtering Forensics Based on Principal Component Analysis Network
by Xian Wang, Bing-Zhao Li
Abstract: As an important issue in forensic analysis, median filtering detection has drawn much attention in the past decade. While several median filtering forensic methods have been proposed, they may face trouble when detecting median filtering on low-resolution or compressed images. In addition, existing median filtering forensic methods mainly depend on manually selected features, which may prevent them from adapting to varied data. To solve these problems, convolutional neural networks (CNNs) have been applied to learn features from the training database automatically, but CNN-based methods train slowly and their parameters are hard to select. We therefore propose a PCANet-based method and test the trained model on several databases. The simulations show that our proposed method achieves better performance, and trains much faster, than the CNN-based method.
Keywords: median filtering; blind forensics; principal component analysis; neural network.
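Median filtering forensics commonly starts from the median filtering residual: the difference between a signal and its median-filtered version, which a prior median filtering drives toward zero. A one-dimensional sketch of that preprocessing step (the paper's pipeline feeds such residuals into PCANet rather than hand-picked statistics):

```python
import statistics

def median_filter_residual(row, w=3):
    """Median filtering residual of a 1-D signal: the signal minus its
    w-tap median-filtered version. An image that was already median
    filtered yields a near-zero residual, which detectors exploit."""
    half = w // 2
    filtered = []
    for i in range(len(row)):
        window = row[max(0, i - half): i + half + 1]
        filtered.append(statistics.median(window))
    return [a - b for a, b in zip(row, filtered)]
```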
A Novel Authentication Scheme for Anonymity and Digital Rights Management Based on Elliptic Curve Cryptography
by Cheng-Chi Lee, Chun-Ta Li, Zhi-Wei Chen, Shun-Der Chen, Yan-Ming Lai
Abstract: Due to the rapid development of computer science and associated technologies, various text documents, multimedia data, software and many other forms of content are now created, stored, and processed digitally, and almost all traditional content of special value, such as paper documents and music or video tapes, has also, where possible, been digitized and managed digitally. As the Internet makes data transmission easy and fast, digital content of all kinds can be spread all over the world at shocking speed. Along with such swiftness and convenience, however, modern computer and communication technologies have also brought various issues associated with digital rights management. Digital rights management (DRM) systems are access control technologies used to restrict the use, modification, and distribution of proprietary hardware and copyrighted works. In view of modern people's heavy dependence on their mobile devices, we consider it a good idea to design a DRM scheme on the basis of elliptic curve cryptography (ECC), because ECC is a very good security tool at the mobile device level. In this paper, we review Amin et al.'s 2016 scheme and point out some security weaknesses we have found. Then, with the security flaws mended, we propose an improved ECC-based protocol for DRM that is especially suitable for applications on mobile devices.
Keywords: Biometric; Digital rights management; ECC; Mobile device; User’s anonymity.
Malware Detection Model Based on Classifying System Calls and Code Attributes: A proof of Concept
by Malik Saleh
Abstract: Malware detection involves static code analysis and dynamic analysis, and both methods have limitations. This research tries to bridge the gap between the two by dynamically predicting the risk before static analysis. The proof of concept examined the code of known malware and concluded that five characteristics of the code predict the risk of any executable file, namely system function calls, encryption, code obfuscation, stalling code, and checking for a debugger environment. The proof of concept validates the effectiveness of the model, showing a 96 percent success rate with limited false-positive results.
Keywords: Malware; Malware detection; System Calls; Classifying system calls; static analysis; dynamic analysis.
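As a trivial illustration of scoring the five code attributes the abstract lists, one could count how many are present. The equal weighting here is our assumption purely for illustration; the paper's model derives its prediction from classifying known malware:

```python
# The five code attributes named in the abstract.
RISK_FEATURES = ["system_function", "encryption", "code_obfuscation",
                 "stalling_code", "debugger_check"]

def risk_score(sample):
    """Toy risk predictor: fraction of the five attributes present in a
    sample (a dict of attribute -> bool). Equal weights are an
    illustrative assumption, not the paper's trained weighting."""
    hits = sum(bool(sample.get(f)) for f in RISK_FEATURES)
    return hits / len(RISK_FEATURES)
```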
Evaluation of Smartphone Data using a Reference Architecture
by Heloise Pieterse, Martin Olivier, Renier Van Heerden
Abstract: The 21st century is continuously witnessing the growth and evolution of smartphone technology, and central to this evolution is the use of popular smartphone applications. People's frequent use of smartphone applications for everyday activities creates and stores large quantities of smartphone data. Such data is susceptible to change and can be compromised by anti-forensic tools, malware or malicious users. It is, therefore, important to establish the authenticity of such data before forming any conclusions, and the first step is to acquire a better understanding of the expected behaviour of smartphone applications. This paper introduces a reference architecture for smartphone applications, which captures the architectural components and models the expected behaviour of smartphone applications. An experiment examining the smartphone data of Android's default messaging application indicates that the reference architecture can assist digital forensic professionals in identifying authentic smartphone data.
Keywords: Digital Forensics; Smartphone Forensics; Smartphones; Authenticity; Reference Architecture; Android; iOS; Applications.
A Road Map for Digital Forensics Research: A Novel Approach for Establishing the Design Science Research Process in Digital Forensics
by Reza Montasari, Victoria Carpenter, Richard Hill
Abstract: Compared with well-established scientific fields such as computer science (CS) or information security (IS), computer forensics (CF) is still evolving as a scientific field. As a result, CF still lacks standardisation in various aspects including, but not limited to, process models, datasets, procedures, techniques and formal research methodologies, which has hindered its establishment as a scientific field and prompted debates on its scientific credentials. This paper addresses one such issue: the absence of formal research methods in CF. It is motivated by the awareness that much of the research to date in CF has focused on applied work at the expense of theoretical aspects, such as the formal research methodologies urgently needed to advance research in digital forensics. This study therefore adds to the body of knowledge by filling the gap left by the absence of a well-established research methodology in CF. To this end, we borrow a well-established research methodology from the domain of IS, namely Peffers et al.'s (2006) design science research process (DSRP), and adapt and extend it to make it relevant to research studies in CF. We demonstrate how each phase of the DSRP can be applied to different stages of CF research. This study sets a precedent for other researchers to identify, adapt, extend and apply other well-established research methods to studies in CF.
Keywords: computer forensics; design science research; research methodology; digital investigations; information system; digital forensics.
Comparison Analysis of Electricity Theft Detection Methods for Advanced Metering Infrastructure in Smart Grid
by Hamed Barzamini, Mona Ghassemian
Abstract: While smart grid technologies are deployed to help achieve improved grid reliability and efficiency, they are vulnerable to cyber-attacks that can result in billions of dollars of losses for energy companies. Selecting an appropriate classification method to detect electricity theft is influenced by operational requirements and resource constraints in real scenarios. Since unsupervised methods have a high error rate, we investigate a new application based on a semi-supervised anomaly detection method that uses the principal component analysis (PCA) technique to detect electricity theft. The performance of this method is compared with the peer-to-peer (P2P) method based on linear equations. The P2P method assumes that electricity theft occurs in a particular situation; our evaluations indicate that, in the absence of this assumption, the P2P detection system yields 100% false alarms. The PCA-based anomaly detection method, by contrast, requires no prior assumptions about the pattern of electricity theft and retains its performance with a 4% false alarm rate. Our analysis shows an average 45% improvement in detection accuracy compared with the P2P method.
Keywords: smart grid; electricity theft; classification method; principal component analysis.
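The PCA-based anomaly detection idea can be illustrated in two dimensions: fit the leading principal component of consumption features and score each meter reading by its reconstruction error (squared distance from the component). The feature choice and the 2-D restriction below are ours, for illustration only:

```python
import math

def pca_anomaly_scores(points):
    """Fit the leading principal component of 2-D points and return each
    point's squared distance from that component. Normal consumption
    hugs the principal axis; large scores flag candidate theft."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    cxx = sum((x - mx) ** 2 for x, _ in points) / n
    cyy = sum((y - my) ** 2 for _, y in points) / n
    cxy = sum((x - mx) * (y - my) for x, y in points) / n
    # orientation of the leading eigenvector of the 2x2 covariance matrix
    theta = 0.5 * math.atan2(2 * cxy, cxx - cyy)
    ux, uy = math.cos(theta), math.sin(theta)
    scores = []
    for x, y in points:
        dx, dy = x - mx, y - my
        along = dx * ux + dy * uy        # projection onto the component
        scores.append((dx - along * ux) ** 2 + (dy - along * uy) ** 2)
    return scores
```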
A Novel LSB Based RDH with Dual Embedding for Encrypted Images
by Debabala Swain, Jayanta Mondal, Devee D. Panda
Abstract: A novel reversible data hiding technique for encrypted images is proposed in this paper. Encryption helps to achieve privacy, which is a necessity for sensitive imagery such as medical and military images, but in the encrypted domain data embedding capacity remains a big challenge. A dual embedding scheme is proposed to enhance the additional data hiding capability. The general architecture includes a content owner, a data hider, and a receiver. This scheme is subjected to work on 512
Keywords: Reversible data hiding; image encryption; least significant bit; dual embedding.
Fingerprint authentication based on fuzzy extractor in the mobile device
by Li Li, Siqin Zhou, Hang Tu
Abstract: Bio-cryptography, the combination of biometrics and cryptography, is a new security technology. Owing to the fuzziness of fingerprint readings, the fuzzy extractor is a good model for protecting biometric data: it can reliably extract almost the same random key R from any input close to the original. However, most fuzzy extractor experiments have been based on desktop computers; we instead implement a fingerprint authentication application for mobile devices based on the fuzzy extractor, with the helper data that must be stored fitting within the capacity of the mobile device's external storage. Unlike previous work, the construction of the input to the secure sketch is very simple and uses ISO/IEC 19794-2 standard minutia data. Most importantly, the scheme offers stronger protection of the biometric template.
Keywords: fingerprint authentication; fuzzy extractor; BCH; Android application.
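A fuzzy extractor's secure-sketch component can be illustrated with the code-offset construction. A 3x repetition code stands in here for the BCH code named in the keywords, tolerating one flipped reading bit per key bit; real systems use far stronger codes:

```python
def _rep_encode(bits, r=3):
    """Repetition code: repeat each key bit r times."""
    return [b for b in bits for _ in range(r)]

def _rep_decode(bits, r=3):
    """Majority-vote decoding of an r-repetition codeword."""
    return [1 if sum(bits[i:i + r]) * 2 > r else 0
            for i in range(0, len(bits), r)]

def gen(w, key_bits, r=3):
    """Code-offset secure sketch: store helper = w XOR encode(key).
    The helper reveals the key only to someone holding a reading close
    to the enrolled fingerprint bits w."""
    c = _rep_encode(key_bits, r)
    return [wi ^ ci for wi, ci in zip(w, c)]

def rep(w_noisy, helper, r=3):
    """Recover the key from a noisy reading w' and the public helper:
    decode(w' XOR helper) corrects the bits where w' differs from w."""
    c_noisy = [wi ^ hi for wi, hi in zip(w_noisy, helper)]
    return _rep_decode(c_noisy, r)
```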
Countermeasures for Timing-Based Side-Channel Attacks against Shared, Modern Computing Hardware
by Reza Montasari, Richard Hill, Amin Hosseinian-Far, Farshad Montaseri
Abstract: There are several vulnerabilities in computing systems hardware that can be exploited by attackers to carry out devastating microarchitectural timing-based side-channel attacks and, as a result, compromise the security of the users of such systems. By exploiting microarchitectural resources, adversaries can launch different variants of timing attacks, for instance to leak sensitive information through timing. In view of these security threats against computing hardware, in a recent study, titled 'Are Timing-Based Side-Channel Attacks Feasible in Shared, Modern Computing Hardware?' and currently under review, we presented and analysed several such attacks. This extended study builds upon that work: we analyse the existing countermeasures against timing attacks and propose new strategies for dealing with such attacks.
Keywords: side channels; timing attacks; hardware attacks; channel attacks; digital investigations; countermeasures.
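One of the simplest software countermeasures in this space is constant-time comparison of secrets, which removes the data-dependent early exit that timing attacks exploit. A minimal sketch (Python's standard library provides hmac.compare_digest for production use):

```python
def constant_time_equal(a, b):
    """Compare two byte strings without an early exit: unlike `==`,
    the loop always scans every byte, so the running time does not
    reveal how long a matching prefix the attacker has guessed."""
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y  # accumulate differences instead of branching
    return diff == 0
```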
Secure and efficient authentication scheme for access control in mobile pay-TV systems
by Jingsong Cui, Hang Tu
Abstract: The increasing ubiquity of mobile devices enables them to access television programs through mobile pay-TV (MPTV) systems. To achieve secure communication in MPTV systems, authentication schemes for access control are needed. Recently, a one-to-many authentication (OTMA) scheme guaranteeing secure communication in MPTV systems was proposed; however, other researchers found that it could not resist the impersonation attack and could not provide mutual authentication, and a new OTMA scheme claimed to resolve these weaknesses was put forward. We demonstrate that this new OTMA scheme still cannot resist the impersonation attack and still cannot provide mutual authentication. To mitigate these major security weaknesses, we propose another new OTMA scheme. A security analysis demonstrates that our proposed scheme overcomes the security weaknesses of the previous OTMA schemes and improves on their performance.
Keywords: authentication; anonymity; impersonation attack; mobile pay-TV; MPTV; security.
LSB based audio steganography preserving minimum sample SNR
by Mohammed A. Nasrullah
Abstract: Steganography is the art of hiding secret data within other data, and audio steganography hides information in an audio signal; one of its methods is least significant bit (LSB) coding. The proposed system embeds bits in LSBs while keeping each sample's signal-to-noise ratio (SNR) above a required minimum. This method embeds as many bits as possible and increases security by keeping the minimum required sample SNR secret. The length of the audio carrier required to embed the message also changes with the minimum required sample SNR.
Keywords: audio steganography; data hiding; least significant bit; LSB; signal to noise ratio; SNR; minimum sample SNR.
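The per-sample capacity rule described above can be sketched as follows. We assume integer PCM samples and take the worst-case noise of replacing k LSBs as 2^k - 1 (our simplification; the paper defines the exact constraint):

```python
import math

def capacity_bits(sample, min_snr_db):
    """How many LSBs of one sample can carry data while keeping that
    sample's SNR at or above min_snr_db. Per-sample SNR is
    20*log10(|sample| / noise); replacing k LSBs adds at most 2**k - 1
    of noise, so we pick the largest k whose worst case still fits."""
    if sample == 0:
        return 0  # a silent sample cannot tolerate any noise
    max_noise = abs(sample) / (10 ** (min_snr_db / 20))
    k = 0
    while (2 ** (k + 1)) - 1 <= max_noise:
        k += 1
    return k
```

Loud samples thus carry more bits than quiet ones, and because the receiver needs the secret minimum SNR to recompute each sample's capacity, an eavesdropper cannot even locate the payload bits.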
Special Issue on: CCC 2016 Cybersecurity in the Connected World
Black hole attack evaluation for AODV and AOMDV routing protocols
by Abdelwadood Mesleh
Abstract: Protecting a mobile ad hoc network (MANET) from malicious attacks is a challenging security issue; many such attacks have been reported against the ad hoc on-demand distance vector (AODV) and ad hoc on-demand multipath distance vector (AOMDV) routing protocols. The black hole attack (BHA) is a serious attack in which wireless packets are redirected to a fake mobile node (MN); the fake MN attacks other MNs by presenting itself as offering the shortest path. This paper studies the impact of BHA on the performance of AODV and AOMDV in terms of throughput, end-to-end delay and packet delivery ratio using network simulator version 2 (NS-2), and compares the resilience of these routing protocols against BHAs. Simulation results reveal that AOMDV is more resilient against BHAs, as it can easily find alternative routes to destination MNs.
Keywords: black hole attack; BHA; ad hoc on demand distance vector; AODV; ad hoc on-demand multipath distance vector; AOMDV; mobile ad hoc networks; MANETs; ad hoc network security.
Information security model using decision tree for Jordanian public sector
by Omar Suleiman Arabeyyat
Abstract: The rapid evolution of technology has created new services and changed the traditional style of delivery. Organisations are adopting e-services to reduce cost and enhance quality. The Jordanian Government has introduced an e-government model, but a major obstacle has interrupted its introduction, specifically its information security system (ISS). While the model was developed rapidly, the government's implementation and management of the corresponding laws and regulations did not proceed at the same speed. Hence, this study investigates and builds a security model for the ISS of the Jordanian public sector, and examines the effect of implementing the King Abdullah II model for excellence in the public sector on building a security culture and awareness; the study uses a decision tree iterative dichotomiser 3 (ID3) classifier. The study concludes that following best practice and security policy are the main factors that drive the performance of the security model.
Keywords: artificial intelligence; decision tree ID3; information security; leadership; public sector; awareness and training; best practice; security policy.
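As background to the ID3 classifier named above, a minimal sketch of the entropy and information-gain computations at the heart of ID3 attribute selection (the toy data and function names are illustrative, not the study's dataset):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Entropy reduction from splitting the examples on one attribute.
    ID3 greedily picks the attribute with the highest gain at each node."""
    n = len(labels)
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(subset) / n * entropy(subset)
                    for subset in by_value.values())
    return entropy(labels) - remainder
```

On a toy set where attribute 0 perfectly predicts the class and attribute 1 is noise, the gains come out as 1.0 and 0.0 bits respectively, so ID3 splits on attribute 0 first.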
Should we be afraid of cyber-terrorism?
by Julian Droogan, Lise Waldek
Abstract: This article explores the extent to which we should fear cyber-terrorism by reviewing scholarship and debates over its nature, in particular speculation about its future affordances. It questions whether terrorists have ever really been able to weaponise the internet much beyond using it as an effective communication tool, which greatly reduces the likelihood of direct internet-facilitated terrorism. First, the history of warnings that the internet was on the verge of being weaponised is presented, even though these warnings have tended not to materialise. It is argued that speculation within the academic and policy community has failed to be borne out in practice largely because terrorists have used the internet less as a weapon and more as a sophisticated communication tool. The article continues by posing a series of questions about online audiences that future research must address if we are to better understand the role of the internet in spreading and supporting violent extremist discourse and cultivating terrorism. The most important question involves better understanding the role of audiences as autonomous agents in navigating, reacting and responding to online violent extremist material.
Keywords: cyber-terrorism; online radicalisation; audience reception theory.
Disclosure of cyber security vulnerabilities: time series modelling
by MingJian Tang, Mamoun Alazab, Yuxiu Luo, Matthew Donlon
Abstract: Cybercriminal use of the internet continues to grow and poses a serious threat to individuals, businesses and governments. Software vulnerabilities are a main cause of cybersecurity problems, and security engineers deal with an increasing flow of cyber security incidents every day. Effective management of software vulnerabilities is imperative for modern organisations regardless of their size. However, vulnerability management processes tend to be reactive, relying on the publication of vulnerabilities, the creation of signatures, and scanning and detection before mitigating controls can be put in place. A forecasting model of the anticipated volume of future disclosures that leverages rich historical vulnerability data provides important insights that help develop strategies for the proactive management of vulnerabilities. This study is the first to discover the existence of volatility clustering in the vulnerability disclosure trend. Through our novel framework for statistically analysing long-term vulnerability disclosures between January 1999 and January 2016, we show that our model can predict the likelihood that software contains yet-to-be-discovered vulnerabilities and will be exposed to future threats such as zero-day attacks. Such knowledge could be an important first step in crime detection and prevention and could improve security practices.
Keywords: cyber security; cybercrime; risk analysis; vulnerability disclosure; volatility; generalised autoregressive conditional heteroskedasticity; time series.
A security framework for node-to-node communications based on the LISP architecture
by Mohammad Muneer Kallash, Jonathan Loo, Aboubaker Lasebae, Mahdi Aiash
Abstract: The locator/ID separation protocol (LISP) is a routing architecture that provides new semantics for IP addressing to support communications between peripheral networks of different technologies. Securing the LISP architecture itself has been investigated in the literature, while securing communications in peripheral networks is left to the individual technologies. The authors of this paper advocate the need for a comprehensive solution to secure communications based on LISP. The paper therefore introduces a new node-to-node authentication and key agreement protocol, which is formally verified using Casper/FDR. Furthermore, the paper demonstrates how to integrate the proposed protocol with LISP's existing security mechanisms in the form of a security framework.
Keywords: authentication and key agreement protocols; node-to-node; formal verification; locator/ID separation protocol; LISP.