An efficient speech perceptual hashing authentication algorithm based on DWT and symmetric ternary string
Online publication date: Thu, 04-Jan-2018
by Zhang Qiuyu; Xing Pengfei; Huang Yibo; Dong Ruihong; Yang Zhongping
International Journal of Information and Communication Technology (IJICT), Vol. 12, No. 1/2, 2018
Abstract: Because existing speech perceptual hashing methods are not well suited to real-time speech content authentication in mobile computing environments, a novel DWT-based perceptual hashing algorithm combining time-domain and frequency-domain features is proposed to protect speech data in the cloud. First, after pre-processing and an intensity-loudness transform (ILT), a new frequency-domain signal is generated from the original speech signal by the discrete wavelet transform (DWT). Second, the low-frequency wavelet decomposition coefficients are partitioned into equal-sized, non-overlapping blocks, and the logarithmic short-time energy of each block is computed to obtain the frequency-domain features of the speech signal. Finally, these are combined with the spectral flux features (SFF) of the speech signal in the time domain to create a ternary perceptual hash sequence. Experimental results show that the ternary form represents the hash digest better than the binary form, and that the proposed algorithm is robust against content-preserving operations, offers good discrimination, compactness and high efficiency, and supports tamper localisation.
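The pipeline the abstract describes (DWT decomposition, block-wise logarithmic short-time energy, ternary quantisation) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the single-level Haar DWT, the block count, and the mean-and-standard-deviation ternary thresholds are all assumptions made for the example, and the spectral flux and ILT steps are omitted.

```python
import numpy as np

def haar_dwt_approx(x):
    # Single-level Haar DWT: keep only the low-frequency
    # (approximation) coefficients, as the abstract describes.
    n = len(x) - len(x) % 2
    return (x[0:n:2] + x[1:n:2]) / np.sqrt(2)

def block_log_energy(coeffs, n_blocks):
    # Partition into equal-sized, non-overlapping blocks and take
    # the logarithmic short-time energy of each block.
    usable = len(coeffs) - len(coeffs) % n_blocks
    blocks = np.array_split(coeffs[:usable], n_blocks)
    return np.array([np.log(np.sum(b ** 2) + 1e-12) for b in blocks])

def ternary_hash(features):
    # Quantise each feature to {-1, 0, 1} relative to the mean,
    # with a dead zone of half a standard deviation
    # (illustrative thresholds, not from the paper).
    mu, sigma = features.mean(), features.std()
    h = np.zeros(len(features), dtype=int)
    h[features > mu + 0.5 * sigma] = 1
    h[features < mu - 0.5 * sigma] = -1
    return h

rng = np.random.default_rng(0)
speech = rng.standard_normal(8000)   # stand-in for a pre-processed speech frame
approx = haar_dwt_approx(speech)
feats = block_log_energy(approx, n_blocks=32)
digest = ternary_hash(feats)
print(digest)
```

A ternary digest like this can be matched against a reference hash by counting differing symbols (a Hamming-style distance), which is what makes per-block tamper localisation possible: a mismatch pinpoints the block, and hence the time span, that was altered.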