A method of automatic text summarisation based on long short-term memory Online publication date: Mon, 11-May-2020
by Wei Fang; TianXiao Jiang; Ke Jiang; Feihong Zhang; Yewen Ding; Jack Sheng
International Journal of Computational Science and Engineering (IJCSE), Vol. 22, No. 1, 2020
Abstract: Deep learning is currently developing rapidly in the NLP field and has achieved many impressive results in recent years. Automatic text summarisation means that the abstract of a document is generated by a computer program without distorting the original intent of the document. Automatic summarisation has many application scenarios, such as news headline generation, scientific abstract generation, search result snippet generation, and product review summarisation. In the era of internet big data and information explosion, if a short text can convey the main content of a document, it will undoubtedly help to alleviate the problem of information overload. In this paper, a model based on the long short-term memory network is presented to automatically analyse and summarise Chinese articles using the seq2seq+attention architecture. Finally, the experimental results are presented and evaluated.
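The attention step of the seq2seq+attention architecture mentioned in the abstract can be illustrated with a minimal sketch. The snippet below assumes Luong-style dot-product attention over LSTM encoder hidden states; the paper's exact scoring function and dimensions are not specified here, so all names and shapes are illustrative.

```python
import numpy as np

def attention_context(decoder_state, encoder_states):
    """Dot-product attention: score each encoder hidden state against
    the current decoder state, normalise the scores with a softmax over
    the time steps, and return the weighted context vector."""
    scores = encoder_states @ decoder_state       # shape (T,)
    weights = np.exp(scores - scores.max())       # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states            # shape (H,)
    return context, weights

# Toy example: 4 encoder time steps, hidden size 3.
rng = np.random.default_rng(0)
enc = rng.standard_normal((4, 3))   # stand-in for LSTM encoder outputs
dec = rng.standard_normal(3)        # stand-in for the decoder state
ctx, w = attention_context(dec, enc)
print(w.sum())   # attention weights sum to 1
```

At each decoding step the context vector is concatenated with the decoder state to predict the next summary token; the weights show which source positions the model attends to.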