Research on multi-feature fusion entity relation extraction based on deep learning
by Shiao Xu; Shuihua Sun; Zhiyuan Zhang; Fan Xu
International Journal of Ad Hoc and Ubiquitous Computing (IJAHUC), Vol. 39, No. 1/2, 2022

Abstract: Entity relation extraction aims to identify the semantic relation category between target entity pairs in a text and is one of the core technologies underlying tasks such as automatic document summarisation, automatic question answering, and machine translation. To address two shortcomings of existing relation extraction models, namely insufficient extraction of local text features and neglect of the semantic interaction information between entities, this paper proposes a novel entity relation extraction model. The model utilises a multi-window convolutional neural network (CNN) to capture multiple local features on the shortest dependency path (SDP) between entities, applies a segmented bidirectional long short-term memory (BiLSTM) network with an attention mechanism to extract global features from the original input sequence, and merges the local and global features to extract entity relations. Experimental results on the SemEval-2010 Task 8 dataset show that the model further improves entity relation extraction performance over existing methods.
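
The abstract describes a local/global feature fusion architecture. A minimal PyTorch sketch of that idea is given below; it is not the authors' code, and the hyperparameters, the simple word-level attention (in place of the paper's segmented BiLSTM attention), and the pooling choices are assumptions for illustration only. The 19 output classes correspond to the SemEval-2010 Task 8 label set (9 directed relations plus Other).

    # Sketch: multi-window CNN over the SDP (local) fused with an
    # attention-weighted BiLSTM over the sentence (global).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MultiFeatureFusionRE(nn.Module):
        def __init__(self, vocab_size, emb_dim=100, num_filters=150,
                     windows=(2, 3, 4), hidden=100, num_classes=19):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            # Multi-window CNN over the shortest dependency path (local features)
            self.convs = nn.ModuleList(
                nn.Conv1d(emb_dim, num_filters, kernel_size=w, padding=w // 2)
                for w in windows
            )
            # BiLSTM over the full input sequence (global features)
            self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                                  bidirectional=True)
            # Simple word-level attention over BiLSTM hidden states (assumed form)
            self.att = nn.Linear(2 * hidden, 1)
            self.classifier = nn.Linear(len(windows) * num_filters + 2 * hidden,
                                        num_classes)

        def forward(self, sdp_ids, sent_ids):
            # Local features: convolve SDP token embeddings, max-pool per window size
            sdp = self.embed(sdp_ids).transpose(1, 2)            # (B, E, L_sdp)
            local = torch.cat([F.relu(conv(sdp)).max(dim=2).values
                               for conv in self.convs], dim=1)   # (B, W*F)
            # Global features: attention-weighted sum of BiLSTM hidden states
            h, _ = self.bilstm(self.embed(sent_ids))             # (B, L, 2H)
            weights = torch.softmax(self.att(h).squeeze(-1), dim=1)   # (B, L)
            glob = (weights.unsqueeze(-1) * h).sum(dim=1)        # (B, 2H)
            # Fuse local and global features, then classify the relation
            return self.classifier(torch.cat([local, glob], dim=1))

The fusion step is a plain concatenation before the softmax classifier, which is the simplest way to combine the two feature views; the paper may use a different fusion or attention formulation.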

Online publication date: Fri, 18-Feb-2022
