Bi-GRU model based on pooling and attention for text classification
by Yu-Lan Hu; Qing-Shan Zhao
International Journal of Wireless and Mobile Computing (IJWMC), Vol. 21, No. 1, 2021

Abstract: To address the problems that most neural-network-based text classification models are prone to overfitting and tend to ignore key words in sentences during training, an improved text classification model is proposed. The proposed model is a Bi-directional Gated Recurrent Unit (Bi-GRU) model based on pooling and an attention mechanism. The model addresses these problems as follows. First, a bidirectional gated recurrent unit is used as the hidden layer to learn deep semantic representations. Second, max-pooling is adopted to extract text features, and a self-attention mechanism is used to capture the influence of individual words and sentences on the classification. Finally, the model concatenates the two representations and uses the result to classify texts. The experimental results show that the proposed model outperforms the Text-CNN and Bi-GRU_CNN models in terms of precision, recall and F-score. Compared with the best of these baseline models, precision, recall and F-score increase by 5.9%, 5.8% and 4.6%, respectively, on the Fudan set, the data set consisting of longer texts.
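To make the described architecture concrete, the following is a minimal sketch (not the authors' code) of a Bi-GRU with a max-pooling branch and a self-attention branch whose outputs are concatenated for classification. It assumes PyTorch, and all hyperparameters and layer names (embed_dim, hidden_dim, num_classes, attn_score) are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PoolingAttentionBiGRU(nn.Module):
    """Sketch of a Bi-GRU text classifier with max-pooling and self-attention branches."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_classes=10):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional GRU hidden layer learns deep semantic representations.
        self.bigru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # Scoring layer for the self-attention branch (one score per time step).
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        # Classifier takes the concatenation of the pooled and attended features.
        self.classifier = nn.Linear(4 * hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        embedded = self.embedding(token_ids)             # (batch, seq, embed_dim)
        outputs, _ = self.bigru(embedded)                # (batch, seq, 2*hidden)

        # Branch 1: max-pooling over time extracts salient text features.
        pooled, _ = outputs.max(dim=1)                   # (batch, 2*hidden)

        # Branch 2: self-attention weighs the influence of each word.
        scores = self.attn_score(outputs)                # (batch, seq, 1)
        weights = F.softmax(scores, dim=1)               # attention distribution
        attended = (weights * outputs).sum(dim=1)        # (batch, 2*hidden)

        # Concatenate ("splice") both representations and classify.
        features = torch.cat([pooled, attended], dim=1)  # (batch, 4*hidden)
        return self.classifier(features)
```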

Online publication date: Fri, 19-Nov-2021
