Title: English grammar correction based on attention-mechanism machine translation
Authors: Kuaile Zhao
Addresses: School of New Energy Vehicle, Zhengzhou Technical College, Zhengzhou, 450121, China
Abstract: Grammatical error correction aims to use computer programs to automatically correct grammatical errors in written text. The current mainstream approach treats it as a monolingual translation task, in which error correction is the process of translating 'wrong' sentences into 'correct' sentences. In this paper, we study methods for improving grammatical error correction performance from three aspects: the model, the training algorithm, and data augmentation. The vast majority of grammatical errors are confined to a small portion of the text, but a minority span multiple segments. We use the Transformer, the state-of-the-art attention-based encoder-decoder model in current neural machine translation, to model grammatical error correction, so that both local context information and long-distance dependencies in the text are taken into account. Experimental results on two standard datasets show that the Transformer significantly outperforms models based on recurrent or convolutional neural networks.
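The abstract frames error correction as translating 'wrong' sentences into 'correct' ones with an attention-based encoder-decoder. The sketch below illustrates that framing only; it is not the authors' implementation, and the vocabulary size, model dimensions, and toy token IDs are illustrative assumptions.

```python
# Minimal sketch: grammatical error correction as monolingual translation
# with a Transformer encoder-decoder (PyTorch). Hyperparameters are assumed.
import torch
import torch.nn as nn


class TransformerGEC(nn.Module):
    """'Translate' an erroneous sentence into its corrected form."""

    def __init__(self, vocab_size: int = 10000, d_model: int = 512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=8,
            num_encoder_layers=6, num_decoder_layers=6,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids: torch.Tensor, tgt_ids: torch.Tensor) -> torch.Tensor:
        # Causal mask so each target position attends only to earlier positions,
        # while encoder self-attention and cross-attention see the whole source,
        # covering both local context and long-distance dependencies.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        hidden = self.transformer(
            self.embed(src_ids), self.embed(tgt_ids), tgt_mask=tgt_mask
        )
        return self.out(hidden)  # logits over the target vocabulary


if __name__ == "__main__":
    model = TransformerGEC()
    src = torch.randint(0, 10000, (2, 12))  # 'wrong' sentences (token IDs)
    tgt = torch.randint(0, 10000, (2, 12))  # 'correct' sentences, shifted right
    logits = model(src, tgt)
    print(logits.shape)                      # (2, 12, vocab_size)
```

Training such a model with cross-entropy over the corrected sentence, as in standard neural machine translation, is the usual way this monolingual-translation view is realised.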
Keywords: correction of English grammatical errors; machine translation; attention mechanism; transformer.
DOI: 10.1504/IJCISTUDIES.2024.144049
International Journal of Computational Intelligence Studies, 2024 Vol.13 No.1/2, pp.78 - 94
Received: 11 Apr 2023
Accepted: 15 Sep 2023
Published online: 22 Jan 2025