Title: Self-attention based sentiment analysis with effective embedding techniques

Authors: Soubraylu Sivakumar; Ratnavel Rajalakshmi

Addresses: School of Computer Science and Engineering, Vellore Institute of Technology, Chennai, Tamil Nadu, India; School of Computer Science and Engineering, Vellore Institute of Technology, Chennai, Tamil Nadu, India

Abstract: Many existing sentiment analysis approaches suffer from sparse vector representations when handling large-scale data and do not consider the semantics of words. Effective word embedding techniques improve sentiment analysis by overcoming these two problems. The task is challenging when a review is expressed in multiple sentences, because the entire sentence must be considered, rather than individual words, to determine the sentiment. To achieve this, we propose a novel LSTM-based deep learning architecture that applies sentence embedding using the universal sentence encoder along with an attention layer. To evaluate the proposed approach, we carried out experiments on the IMDB data set. The experimental results show that the proposed method is significantly better than other word-embedding-based approaches, with an improvement of 5%, achieving an F1 score of 89.12%.
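The abstract describes encoding each sentence of a review (e.g. with the universal sentence encoder, which yields a fixed-size vector per sentence), running an LSTM over the sentence sequence, and using an attention layer to weight the hidden states before classification. The following is a minimal NumPy sketch of one common additive self-attention pooling over LSTM hidden states; the shapes, parameter names, and the specific attention formulation are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Assumed shapes: a review split into 6 sentences, each mapped to a
# sentence embedding (USE produces 512-d vectors); an LSTM over that
# sequence yields one hidden state per sentence.
T, d = 6, 128                     # sentences per review, LSTM hidden size (assumed)
H = rng.normal(size=(T, d))       # stand-in for LSTM hidden states h_1..h_T

# Additive self-attention pooling (one standard formulation):
#   score_t = v . tanh(W h_t);  alpha = softmax(score);  c = sum_t alpha_t h_t
W = rng.normal(size=(d, d)) * 0.1
v = rng.normal(size=(d,)) * 0.1
scores = np.tanh(H @ W) @ v       # one scalar score per sentence
alpha = softmax(scores)           # attention weights over sentences, sum to 1
context = alpha @ H               # review-level vector fed to the classifier

print(alpha.shape, context.shape)  # (6,) (128,)
```

The attention weights let the classifier focus on sentiment-bearing sentences of the review instead of treating every sentence equally, which matches the multi-sentence motivation given in the abstract.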

Keywords: sentiment analysis; universal sentence encoder; long short term memory; word embedding; GloVe; self-attention.

DOI: 10.1504/IJCAT.2021.113651

International Journal of Computer Applications in Technology, 2021 Vol.65 No.1, pp.65 - 77

Received: 18 Jul 2019
Accepted: 30 May 2020

Published online: 06 Mar 2021
