Title: Text similarity semantic calculation based on deep reinforcement learning

Authors: Guanlin Chen; Xiaolong Shi; Moke Chen; Liang Zhou

Addresses: Guanlin Chen: School of Computer and Computing Science, Zhejiang University City College, Hangzhou, 310015, China; College of Computer Science and Technology, Zhejiang University, Hangzhou, 310027, China. Xiaolong Shi: School of Computer and Computing Science, Zhejiang University City College, Hangzhou, 310015, China; College of Computer Science and Technology, Zhejiang University, Hangzhou, 310027, China. Moke Chen: E-Government Office of Hangzhou Municipal People's Government, Hangzhou, 310020, China. Liang Zhou: E-Government Office of Hangzhou Municipal People's Government, Hangzhou, 310020, China.

Abstract: Semantic analysis is a fundamental technology in natural language processing. Semantic similarity calculation is involved in many natural language processing applications, such as question answering (QA) systems, machine translation, text similarity calculation, text classification, information extraction, and even speech recognition. This paper proposes a new framework for computing semantic similarity: the deep reinforcement learning-based Siamese attention structure model (DRSASM). The model learns word segmentation and word distillation automatically through reinforcement learning. The overall architecture uses an LSTM network to extract semantic features and then introduces a new attention mechanism model to enhance semantics. Experiments show that on the SNLI dataset and a Chinese business dataset, the new model improves accuracy compared to current baseline structure models.
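To make the described architecture concrete, the sketch below shows a minimal Siamese LSTM encoder with an additive attention layer scoring sentence-pair similarity, assuming a PyTorch implementation. The paper's DRSASM additionally learns word segmentation and word distillation through reinforcement learning, which is not reproduced here; all layer sizes, class names, and the cosine-similarity scoring choice are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch: Siamese BiLSTM + attention for sentence-pair similarity.
# Hypothetical names and hyperparameters; not the paper's actual DRSASM code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseAttentionLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Shared encoder: both sentences pass through the same LSTM (Siamese).
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # Simple additive attention over the LSTM outputs.
        self.attn = nn.Linear(2 * hidden_dim, 1)

    def encode(self, token_ids):
        emb = self.embedding(token_ids)              # (B, T, E)
        out, _ = self.lstm(emb)                      # (B, T, 2H)
        weights = F.softmax(self.attn(out), dim=1)   # (B, T, 1) over time steps
        return (weights * out).sum(dim=1)            # (B, 2H) attended summary

    def forward(self, sent_a, sent_b):
        va, vb = self.encode(sent_a), self.encode(sent_b)
        # Cosine similarity as the semantic similarity score in [-1, 1].
        return F.cosine_similarity(va, vb, dim=1)

# Usage: score two toy sentence pairs built from random token ids.
model = SiameseAttentionLSTM(vocab_size=10000)
a = torch.randint(0, 10000, (2, 12))
b = torch.randint(0, 10000, (2, 12))
print(model(a, b))  # tensor of 2 similarity scores
```

The Siamese design (one shared encoder for both inputs) is what lets the model compare arbitrary sentence pairs in a common embedding space; the attention layer replaces the usual last-hidden-state summary with a weighted sum over all time steps.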

Keywords: big data; machine learning; deep learning; natural language processing; semantic similarity; semantic computing; reinforcement learning; attention model; LSTM model.

DOI: 10.1504/IJSN.2020.106526

International Journal of Security and Networks, 2020 Vol.15 No.1, pp.59 - 66

Received: 17 May 2019
Accepted: 19 Jun 2019

Published online: 09 Apr 2020
