Title: Research on machine reading comprehension based on pre-trained model

Authors: Guanlin Chen; Rutao Yao; Haiwei Zhou; Tian Li; Wujian Yang

Addresses:
Guanlin Chen: School of Computer and Computing Science, Zhejiang University City College, Hangzhou, 310015, China; School of Computer, Zhejiang University, Hangzhou, 310027, China
Rutao Yao: School of Computer and Computing Science, Zhejiang University City College, Hangzhou, 310015, China; School of Computer, Zhejiang University, Hangzhou, 310027, China
Haiwei Zhou: Hangzhou Real Estate Development Research Center, Hangzhou, 310006, China
Tian Li: School of Computer and Computing Science, Zhejiang University City College, Hangzhou, 310015, China
Wujian Yang: School of Computer and Computing Science, Zhejiang University City College, Hangzhou, 310015, China

Abstract: To improve the machine reading ability of the model, this article starts from the high-level semantic information of text and the core concepts of model distillation. Among the high-level semantic information of text, part-of-speech information and named-entity information are selected as additional information available to the model. Combining part-of-speech tagging and named entity recognition, BERT-HSI, a model built on a pre-trained model that fuses high-level semantic information, is proposed. Building on BERT-HSI, the paper then takes the perspective of model optimisation and, with the concepts of model distillation at its core, proposes a capability learning method. Finally, a machine reading comprehension method based on the pre-trained model is presented, which both integrates high-level semantic information and includes the capability learning process.
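The abstract names two technical ingredients without specifying their mechanics: fusing part-of-speech (POS) and named-entity (NER) information into a BERT-based reader, and a distillation-style capability learning objective. A minimal sketch of the first idea follows; the tag-set sizes, embedding dimensions, and fusion-by-concatenation operator are assumptions for illustration, not details taken from the paper.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class BertHSISketch(nn.Module):
    """Hedged sketch of the BERT-HSI idea: learn embeddings for POS and
    NER tags and concatenate them with BERT's token representations
    before extractive span prediction. All hyperparameters here are
    illustrative assumptions."""

    def __init__(self, num_pos_tags=18, num_ner_tags=9, tag_dim=32):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size
        self.pos_emb = nn.Embedding(num_pos_tags, tag_dim)
        self.ner_emb = nn.Embedding(num_ner_tags, tag_dim)
        # Two logits per token: answer-span start and end (extractive MRC).
        self.span_head = nn.Linear(hidden + 2 * tag_dim, 2)

    def forward(self, input_ids, attention_mask, pos_ids, ner_ids):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        # Fuse contextual token states with the high-level semantic tags.
        fused = torch.cat(
            [out.last_hidden_state, self.pos_emb(pos_ids), self.ner_emb(ner_ids)],
            dim=-1,
        )
        start_logits, end_logits = self.span_head(fused).split(1, dim=-1)
        return start_logits.squeeze(-1), end_logits.squeeze(-1)
```

For the second ingredient, the paper does not detail its capability learning objective here; one standard way to realise a distillation-centred objective is to blend hard-label cross-entropy with a temperature-softened KL term against a stronger teacher, as in the sketch below (temperature T and mixing weight alpha are illustrative).

```python
import torch.nn.functional as F

def capability_learning_loss(student_logits, teacher_logits, labels,
                             T=2.0, alpha=0.5):
    """Standard distillation loss over (batch, num_classes) logits,
    offered as an assumed stand-in for the paper's capability learning
    objective, whose exact form is not given in the abstract."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)                       # rescale gradients for temperature
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard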

Keywords: machine reading comprehension; high-level semantic information; capability learning; pre-training model; attention mechanism.

DOI: 10.1504/IJRIS.2022.126648

International Journal of Reasoning-based Intelligent Systems, 2022 Vol.14 No.4, pp.240 - 246

Received: 01 Apr 2022
Accepted: 19 May 2022

Published online: 31 Oct 2022
