Title: Low-loss data compression using deep learning framework with attention-based autoencoder

Authors: S. Sriram; P. Chitra; V. Vijay Sankar; S. Abirami; S.J. Rethina Durai

Addresses: Thiagarajar College of Engineering, Madurai, Tamil Nadu, India; Thiagarajar College of Engineering, Madurai, Tamil Nadu, India; Thiagarajar College of Engineering, Madurai, Tamil Nadu, India; School of Computer Science and Engineering, Vellore Institute of Technology, Chennai, Tamil Nadu, India; Thiagarajar College of Engineering, Madurai, Tamil Nadu, India

Abstract: With the rapid growth of digital media, data compression plays a vital role in efficient data storage and transmission. Deep learning offers a way to overcome the limitations of traditional Windows archivers. The proposed work first investigates multi-layer autoencoder models, which achieve higher compression rates than traditional Windows archivers but suffer from reconstruction loss. To address this, an attention layer is introduced into the autoencoder to reduce both the difference between the encoder-side and decoder-side latent representations of an input and the difference between the original input and the reconstructed output. The proposed attention-based autoencoder is extensively evaluated on atmospheric and oceanic data obtained from the Centre for Development of Advanced Computing (CDAC). The results show that the proposed model achieves a compression rate roughly 89.7% higher than the traditional Windows archiver and a reconstruction loss about 25% lower than that of the multi-layer autoencoder.
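To make the abstract's design concrete, the following is a minimal, hypothetical sketch of an autoencoder with an attention layer whose training loss combines reconstruction error with a latent-consistency term, as the abstract describes. The layer sizes, the use of single-head self-attention, the loss weight `alpha`, and all names here are illustrative assumptions, not the authors' published architecture.

```python
# Hypothetical sketch only: layer sizes, attention formulation, and the
# alpha weighting are assumptions, not the paper's actual model.
import torch
import torch.nn as nn

class AttentionAutoencoder(nn.Module):
    def __init__(self, input_dim=256, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Single-head self-attention over the latent code (treated as a
        # length-1 sequence) -- an assumed stand-in for the attention layer.
        self.attention = nn.MultiheadAttention(
            embed_dim=latent_dim, num_heads=1, batch_first=True)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, input_dim),
        )

    def forward(self, x):
        z_enc = self.encoder(x)              # encoder-side latent code
        z_seq = z_enc.unsqueeze(1)           # (batch, 1, latent_dim)
        z_att, _ = self.attention(z_seq, z_seq, z_seq)
        z_dec = z_att.squeeze(1)             # latent code fed to the decoder
        return self.decoder(z_dec), z_enc, z_dec

def loss_fn(x, x_hat, z_enc, z_dec, alpha=0.1):
    # Reconstruction term plus a term penalising the gap between the
    # encoder-side and decoder-side latent representations.
    recon = nn.functional.mse_loss(x_hat, x)
    latent = nn.functional.mse_loss(z_dec, z_enc)
    return recon + alpha * latent

model = AttentionAutoencoder()
x = torch.randn(8, 256)                      # stand-in data batch
x_hat, z_enc, z_dec = model(x)
loss = loss_fn(x, x_hat, z_enc, z_dec)
loss.backward()
```

In this reading, the attention output is what the decoder consumes, so penalising the encoder/decoder latent gap pushes the compressed code toward something the decoder can reconstruct faithfully; how the paper actually couples the two terms is not specified in the abstract.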

Keywords: deep learning; multi-layer autoencoder; compression ratio; attention; reconstruction loss.

DOI: 10.1504/IJCSE.2023.129150

International Journal of Computational Science and Engineering, 2023 Vol.26 No.1, pp.90 - 100

Received: 30 Jul 2021
Accepted: 01 Nov 2021

Published online: 23 Feb 2023
