Title: Bi-GRU model based on pooling and attention for text classification

Authors: Yu-Lan Hu; Qing-Shan Zhao

Addresses: Department of Computer Science and Technology, Xinzhou Teachers University, Xinzhou, Shanxi Province, China; Department of Computer Science and Technology, Xinzhou Teachers University, Xinzhou, Shanxi Province, China

Abstract: To address the problems that most neural-network-based text classification models overfit easily and ignore key words in sentences during training, an improved text classification model is proposed. The proposed model is a bidirectional gated recurrent unit (Bi-GRU) model based on pooling and an attention mechanism. The model solves the above problems in the following ways. First, the bidirectional gated recurrent unit is used as the hidden layer to learn a deep semantic representation. Second, max-pooling is adopted to extract text features, and a self-attention mechanism is used to capture the influence of individual words and sentences on text classification. Finally, the model concatenates the two resulting feature vectors to classify texts. The experimental results show that the proposed model outperforms the Text-CNN model and the Bi-GRU_CNN model in terms of precision, recall and F-score. Compared with the best baseline model, precision, recall and F-score are increased by 5.9%, 5.8% and 4.6% respectively on the Fudan set, the longer-text dataset.
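The feature-fusion step described in the abstract — max-pooling the Bi-GRU hidden states, weighting them with self-attention, and concatenating ("splicing") the two vectors — can be sketched as follows. This is a minimal illustrative sketch in numpy, not the authors' implementation: the attention scoring function, shapes, and the `pool_and_attend` helper are all assumptions for demonstration.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def pool_and_attend(H, w):
    """H: (seq_len, hidden) Bi-GRU outputs; w: (hidden,) attention query (assumed form)."""
    pooled = H.max(axis=0)                     # max-pooling over time -> (hidden,)
    scores = softmax(H @ w)                    # one attention weight per time step
    attended = scores @ H                      # attention-weighted sum -> (hidden,)
    return np.concatenate([pooled, attended])  # spliced feature vector -> (2*hidden,)

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))  # e.g. 5 time steps, hidden size 8
w = rng.standard_normal(8)
features = pool_and_attend(H, w)
print(features.shape)            # (16,) — fed to the final classification layer
```

In the paper's pipeline, this concatenated vector would then pass through a softmax classification layer; the Bi-GRU producing `H` is omitted here for brevity.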

Keywords: text classification; bi-direction gated recurrent unit; max pooling; self-attention mechanism.

DOI: 10.1504/IJWMC.2021.119057

International Journal of Wireless and Mobile Computing, 2021 Vol.21 No.1, pp.26 - 31

Received: 11 Oct 2020
Accepted: 16 Dec 2020

Published online: 13 Nov 2021
