Title: A novel domain adaption approach for neural machine translation

Authors: Jin Liu; Xin Zhang; Xiaohu Tian; Jin Wang; Arun Kumar Sangaiah

Addresses: Jin Liu, Xin Zhang, Xiaohu Tian: College of Information Engineering, Shanghai Maritime University, Shanghai, 201306, China; Jin Wang: School of Computer and Communication Engineering, Changsha University of Science and Technology, Changsha, 410114, China and School of Information Science and Engineering, Fujian University of Technology, Fuzhou, 350118, China; Arun Kumar Sangaiah: School of Computing Science and Engineering, Vellore Institute of Technology, Tamil Nadu, 632014, India

Abstract: Neural machine translation has been widely adopted in modern machine translation, achieving state-of-the-art performance on large-scale parallel corpora. For real-world applications, high-quality translation of text in a specific domain is crucial. However, the performance of general neural machine translation models drops when they are applied to a specific domain. To alleviate this issue, this paper presents a novel machine translation method that combines a model fusion algorithm with log-linear interpolation. The method improves the performance of the in-domain translation model while preserving, or even improving, the performance of the out-of-domain translation model. Extensive experiments on the proposed translation model were carried out using the public United Nations parallel corpus. The bilingual evaluation understudy (BLEU) scores on the in-domain and out-of-domain corpora reach 30.27 and 43.17, respectively, an improvement over existing methods.
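The abstract does not give the paper's interpolation formula; as a rough illustration only, a minimal sketch of log-linear interpolation between an in-domain and an out-of-domain model's per-token distributions, assuming hypothetical log-probability vectors and a tunable weight lambda (all names below are illustrative, not from the paper), might look like this:

```python
import numpy as np
from scipy.special import logsumexp

def log_linear_interpolate(logp_in, logp_out, lam=0.5):
    """Combine per-token log-probabilities from an in-domain model and an
    out-of-domain model via log-linear interpolation, then renormalise.

    logp_in, logp_out: 1-D arrays of log-probabilities over the same target
    vocabulary; lam: interpolation weight given to the in-domain model.
    (Illustrative sketch; the paper's actual fusion method may differ.)
    """
    combined = lam * logp_in + (1.0 - lam) * logp_out
    # Subtract the log partition function so the result is a valid
    # probability distribution again.
    return combined - logsumexp(combined)

# Toy usage over a 4-word vocabulary: the in-domain model is peaked,
# the out-of-domain model is uniform.
logp_in = np.log(np.array([0.6, 0.2, 0.1, 0.1]))
logp_out = np.log(np.array([0.25, 0.25, 0.25, 0.25]))
print(np.exp(log_linear_interpolate(logp_in, logp_out, lam=0.7)))
```

In decoding, such a combined distribution would be recomputed at every target position, with lambda tuned on a held-out in-domain development set.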

Keywords: neural machine translation; model fusion; domain adaption.

DOI: 10.1504/IJCSE.2020.109404

International Journal of Computational Science and Engineering, 2020, Vol.22, No.4, pp.445-453

Received: 03 Apr 2019
Accepted: 30 Oct 2019

Published online: 08 Sep 2020
