Selective secure error correction on SPIHT coefficients for pervasive wireless visual network
by Jia Jan Ong; Li-Minn Ang; Kah Phooi Seng
International Journal of Ad Hoc and Ubiquitous Computing (IJAHUC), Vol. 13, No. 2, 2013

Abstract: This paper presents a secure error correction architecture with data compression using the SPIHT algorithm for implementation in a wireless visual network (WVN). Using the Cauchy Reed-Solomon (CRS) coding scheme, the mapping bits produced by the SPIHT algorithm are protected from errors and from adversarial attacks during wireless transmission. To meet the hardware constraints of a WVN, the CRS architecture is built on a very low complexity structure employing a minimal instruction set computing (MISC) architecture. A particular configuration of the CRS MISC is chosen that offers key space security similar to that provided by the Advanced Encryption Standard (AES). Both the SPIHT MIPS and CRS MISC architectures are implemented on an FPGA, occupying only 5017 slices and 82 block RAMs on the Xilinx Virtex 2, which demonstrates the feasibility of implementation in low complexity sensor nodes.
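To illustrate the Cauchy Reed-Solomon idea behind the architecture, the sketch below builds a Cauchy generator matrix over GF(2^8) and uses it to produce parity blocks from data blocks. This is a minimal software illustration only, not the paper's MISC hardware realisation; all function names and parameter choices here are the author's own for demonstration.

```python
# Minimal sketch of Cauchy Reed-Solomon (CRS) erasure coding over GF(2^8).
# Illustrative only; the paper's low-complexity MISC implementation differs.

def gf_mul(a, b, poly=0x11B):
    # Multiply in GF(2^8), reducing by x^8 + x^4 + x^3 + x + 1 (the AES polynomial).
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= poly
        b >>= 1
    return r

def gf_inv(a):
    # Brute-force multiplicative inverse; the field has only 255 nonzero elements.
    for x in range(1, 256):
        if gf_mul(a, x) == 1:
            return x
    raise ZeroDivisionError("0 has no inverse in GF(2^8)")

def cauchy_matrix(rows, cols):
    # C[i][j] = 1 / (x_i + y_j) with disjoint sets {x_i}, {y_j}; in GF(2^8)
    # addition is XOR. Every square submatrix of a Cauchy matrix is invertible,
    # which is what makes CRS a maximum-distance-separable (MDS) code.
    xs = range(cols, cols + rows)
    ys = range(cols)
    return [[gf_inv(x ^ y) for y in ys] for x in xs]

def crs_encode(data_blocks, n_parity):
    # Each data block is a list of byte values; emit n_parity parity blocks.
    # Any k of the resulting k + n_parity blocks suffice to recover the data.
    k = len(data_blocks)
    size = len(data_blocks[0])
    parity = []
    for row in cauchy_matrix(n_parity, k):
        p = [0] * size
        for coef, block in zip(row, data_blocks):
            for idx in range(size):
                p[idx] ^= gf_mul(coef, block[idx])
        parity.append(p)
    return parity
```

In a WVN setting, the compressed SPIHT bit-stream would be split into the data blocks, and the parity blocks would let the receiver tolerate lost or corrupted packets; keying the choice of Cauchy matrix is one way such a scheme can also provide the confidentiality the abstract alludes to.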

Online publication date: Tue, 28-May-2013
