Multi-applicable text classification based on deep neural network
by Jingjing Yang; Feng Deng; Suhuan Lv; Rui Wang; Qi Guo; Zongchun Kou; Shiqiang Chen
International Journal of Sensor Networks (IJSNET), Vol. 40, No. 4, 2022

Abstract: Most deep-learning-based long text classification methods suffer from problems such as semantic sparsity and long-distance dependence. To tackle these problems, a novel multi-applicable text classification method based on a deep neural network (MTDNN) is proposed. It combines bidirectional encoder representations from transformers (BERT), a dimension reduction layer, and a bidirectional long short-term memory (Bi-LSTM) network with an attention mechanism. BERT converts words into pre-trained word embedding vectors. The dimension reduction layer extracts the higher-weight feature phrase representations from these embeddings. The Bi-LSTM captures both forward and backward context representations, and an attention mechanism focuses on the most informative parts of the Bi-LSTM output. Experimental results show that the accuracy of MTDNN on long text classification, short text classification, and sentiment analysis reaches 94.95%, 93.53%, and 92.32%, respectively, outperforming other state-of-the-art text classification methods.
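The pipeline the abstract outlines (BERT embeddings, a dimension reduction layer, a Bi-LSTM, and attention pooling) can be illustrated with a short PyTorch sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the class name MTDNNSketch, the use of a linear projection for dimension reduction, the additive attention, and all layer sizes are assumptions made for the example.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class MTDNNSketch(nn.Module):
    """Hypothetical sketch of the MTDNN pipeline described in the abstract:
    BERT -> dimension reduction -> Bi-LSTM -> attention -> classifier."""

    def __init__(self, num_classes, bert_name="bert-base-uncased",
                 reduced_dim=256, lstm_hidden=128):
        super().__init__()
        # Pre-trained BERT supplies contextual word embedding vectors.
        self.bert = BertModel.from_pretrained(bert_name)
        bert_dim = self.bert.config.hidden_size  # 768 for bert-base
        # Dimension reduction layer: here a simple linear projection
        # (the paper's exact design may differ).
        self.reduce = nn.Linear(bert_dim, reduced_dim)
        # Bi-LSTM captures forward and backward context over the sequence.
        self.bilstm = nn.LSTM(reduced_dim, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Additive attention over the Bi-LSTM outputs.
        self.attn_score = nn.Linear(2 * lstm_hidden, 1)
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # (batch, seq_len, bert_dim) contextual embeddings from BERT.
        emb = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state
        reduced = torch.tanh(self.reduce(emb))
        out, _ = self.bilstm(reduced)               # (batch, seq_len, 2*hidden)
        scores = self.attn_score(out).squeeze(-1)   # (batch, seq_len)
        # Mask padding positions before softmax so they get zero attention.
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        context = (weights * out).sum(dim=1)        # attention-pooled representation
        return self.classifier(context)
```

In this reading, the attention layer replaces plain last-state or mean pooling: each Bi-LSTM output position receives a learned weight, so the pooled representation emphasizes the phrases most relevant to the class label, which is what the abstract means by "focusing on" the Bi-LSTM output.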

Online publication date: Mon, 19-Dec-2022
