Title: A method of automatic text summarisation based on long short-term memory

Authors: Wei Fang; TianXiao Jiang; Ke Jiang; Feihong Zhang; Yewen Ding; Jack Sheng

Addresses: School of Computer and Software, Jiangsu Engineering Center of Network Monitoring, Nanjing University of Information Science and Technology, Nanjing, China; State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, Jiangsu, China (Wei Fang, TianXiao Jiang, Ke Jiang, Feihong Zhang, Yewen Ding); Department of Economics, Finance, Insurance, and Risk Management, University of Central Arkansas, USA (Jack Sheng)

Abstract: Deep learning has advanced rapidly in the NLP field and has produced impressive results in recent years. Automatic text summarisation means that a computer program condenses a document into an abstract without changing the document's original intent. Automatic summarisation has many application scenarios, such as news headline generation, scientific abstract generation, search result snippet generation, and product review summarisation. In the era of internet big data and information explosion, short texts that capture the main content of a document can undoubtedly help alleviate information overload. In this paper, a model based on the long short-term memory (LSTM) network is presented that automatically analyses and summarises Chinese articles using the seq2seq + attention architecture. Finally, experimental results are reported and evaluated.
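The abstract gives no implementation details; as a rough illustration of the architecture it describes, the sketch below assembles a seq2seq LSTM encoder-decoder with Luong-style dot-product attention in TensorFlow/Keras, with Jieba used for Chinese word segmentation. The vocabulary size, embedding and hidden dimensions, and layer arrangement are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch of a seq2seq + attention summariser in TensorFlow/Keras.
# All sizes (VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM) are hypothetical placeholders.
import tensorflow as tf
import jieba

VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM = 50000, 128, 256

# Chinese word segmentation with Jieba before token-to-id mapping.
tokens = jieba.lcut("自动文本摘要")  # e.g. ['自动', '文本', '摘要']

# Encoder: embeds the segmented source tokens and runs an LSTM.
enc_inputs = tf.keras.Input(shape=(None,), dtype="int32", name="article_tokens")
enc_embed = tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM)(enc_inputs)
enc_outputs, enc_h, enc_c = tf.keras.layers.LSTM(
    HIDDEN_DIM, return_sequences=True, return_state=True)(enc_embed)

# Decoder: generates the summary, initialised with the encoder's final state.
dec_inputs = tf.keras.Input(shape=(None,), dtype="int32", name="summary_tokens")
dec_embed = tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM)(dec_inputs)
dec_outputs, _, _ = tf.keras.layers.LSTM(
    HIDDEN_DIM, return_sequences=True, return_state=True)(
        dec_embed, initial_state=[enc_h, enc_c])

# Dot-product attention: each decoder step attends over encoder outputs.
context = tf.keras.layers.Attention()([dec_outputs, enc_outputs])
concat = tf.keras.layers.Concatenate()([dec_outputs, context])
logits = tf.keras.layers.Dense(VOCAB_SIZE)(concat)

model = tf.keras.Model([enc_inputs, dec_inputs], logits)
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
```

Trained with teacher forcing (the target summary shifted by one position as decoder input), a model of this shape learns to emit one summary token per step; at inference time decoding would proceed greedily or with beam search.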

Keywords: text summarisation; deep NLP; TensorFlow; recurrent neural network; RNN; long short-term memory; LSTM; Seq2Seq; attention; Jieba; word segmentation; language model.

DOI: 10.1504/IJCSE.2020.107243

International Journal of Computational Science and Engineering, 2020 Vol.22 No.1, pp.39 - 49

Received: 14 Feb 2019
Accepted: 05 May 2019

Published online: 11 May 2020
