Computer and Modernization ›› 2021, Vol. 0 ›› Issue (07): 38-42.


Low-resource Neural Machine Translation Based on ELMo

(1. School of Computer and Information Technology, Northeast Petroleum University, Daqing 163318, China;
2. School of Computer Science and Technology, Harbin Institute of Technology, Harbin 150001, China)
Online: 2021-08-02    Published: 2021-08-02

Abstract: The central difficulty in low-resource neural machine translation is the lack of a large parallel corpus with which to train the model. Pre-trained models have brought substantial improvements to a wide range of natural language processing tasks. This paper proposes a neural machine translation model that incorporates ELMo to address the low-resource translation problem. Compared with back-translation, the proposed model gains more than 0.7 BLEU on the Turkish-English low-resource translation task and more than 0.8 BLEU on the Romanian-English task. In addition, compared with the traditional neural machine translation model, it improves the simulated low-resource translation tasks of Chinese-English, French-English, German-English and Spanish-English by 2.3, 3.2, 2.6 and 3.2 BLEU respectively. The experimental results show that the ELMo model is effective for low-resource neural machine translation.
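The abstract does not give implementation details of how the ELMo representations are combined before being fed to the translation model. As background, ELMo (Peters et al., 2018) collapses the hidden states of all biLM layers into a single vector per token via a task-specific softmax-weighted sum, ELMo_k = γ · Σ_j softmax(s)_j · h_{k,j}. The following NumPy sketch illustrates only that standard weighting step; the array shapes, function name, and random inputs are illustrative assumptions, not the authors' code.

```python
import numpy as np

def elmo_combine(layer_states, scalar_logits, gamma=1.0):
    """Collapse biLM layer representations into one ELMo vector per token:
    ELMo_k = gamma * sum_j softmax(s)_j * h_{k,j}.

    layer_states: (num_layers, seq_len, dim) array of biLM hidden states.
    scalar_logits: (num_layers,) learned task-specific weights s_j.
    gamma: learned scalar that rescales the whole ELMo vector.
    """
    # softmax over the layer axis (stabilised by subtracting the max)
    s = np.exp(scalar_logits - scalar_logits.max())
    s = s / s.sum()
    # weighted sum of the layers: contract s against axis 0 of layer_states
    return gamma * np.tensordot(s, layer_states, axes=(0, 0))

# Hypothetical biLM output: 3 layers, 5 tokens, 8-dimensional states.
rng = np.random.default_rng(0)
states = rng.standard_normal((3, 5, 8))
elmo = elmo_combine(states, np.zeros(3))  # zero logits -> equal layer weights
print(elmo.shape)  # (5, 8): one combined vector per token
```

With zero logits the softmax assigns equal weight to every layer, so the result is simply the mean over layers; during training the logits and γ would be learned jointly with the downstream NMT model.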

Key words: low-resource, parallel corpus, pre-training model, neural machine translation, translation model