A novel domain adaption approach for neural machine translation
Online publication date: Tue, 08-Sep-2020
by Jin Liu; Xin Zhang; Xiaohu Tian; Jin Wang; Arun Kumar Sangaiah
International Journal of Computational Science and Engineering (IJCSE), Vol. 22, No. 4, 2020
Abstract: Neural machine translation has been widely adopted in modern machine translation, as it achieves state-of-the-art performance on large-scale parallel corpora. For real-world applications, high-quality translation of text in a specific domain is crucial. However, the performance of general neural machine translation models drops when they are applied to a specific domain. To alleviate this issue, this paper presents a novel machine translation method that combines a model fusion algorithm with log-linear interpolation. The method improves the performance of the in-domain translation model while preserving, or even improving, the performance of the out-of-domain translation model. This paper carries out extensive experiments on the proposed translation model using the public United Nations corpus. The bilingual evaluation understudy (BLEU) scores on the in-domain and out-of-domain corpora reach 30.27 and 43.17 respectively, an improvement over existing methods.
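The log-linear interpolation mentioned in the abstract can be illustrated with a minimal sketch. This is a generic illustration of the technique, not the paper's exact algorithm: the function name, the interpolation weight `lam`, and the toy vocabularies are assumptions introduced here for illustration.

```python
import math

def log_linear_interpolate(p_in, p_out, lam):
    """Combine per-token probabilities from an in-domain model (p_in)
    and an out-of-domain model (p_out) via log-linear interpolation:
    log p(w) is proportional to lam*log p_in(w) + (1-lam)*log p_out(w),
    then renormalized into a proper distribution."""
    scores = {w: lam * math.log(p_in[w]) + (1 - lam) * math.log(p_out[w])
              for w in p_in}
    # exponentiate with max-subtraction for numerical stability,
    # then renormalize so the mixed scores sum to 1
    m = max(scores.values())
    exps = {w: math.exp(s - m) for w, s in scores.items()}
    z = sum(exps.values())
    return {w: e / z for w, e in exps.items()}

# Toy next-token distributions from two hypothetical models over
# a three-word vocabulary; lam=0.7 weights the in-domain model more.
p_in = {"treaty": 0.6, "deal": 0.3, "pact": 0.1}
p_out = {"treaty": 0.2, "deal": 0.5, "pact": 0.3}
mixed = log_linear_interpolate(p_in, p_out, lam=0.7)
```

With `lam=0.7` the in-domain preference dominates, so "treaty" receives the highest mixed probability; setting `lam=0` would recover the out-of-domain distribution exactly.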