Applied Sciences
Machine Translation in Low-Resource Languages by an Adversarial Neural Network
Hao Wang 1, Mengtao Sun 2, Ibrahim A. Hameed 2, Mark Pasquine 3
1 Department of Computer Science, Norwegian University of Science and Technology, 2815 Gjøvik, Norway
2 Department of ICT and Natural Sciences, Norwegian University of Science and Technology, 6009 Ålesund, Norway
3 Department of International Business, Norwegian University of Science and Technology, 6009 Ålesund, Norway
Keywords: machine learning; adversarial machine learning; imbalanced datasets; transfer learning
DOI: 10.3390/app112210860
Source: DOAJ
【 Abstract 】
Existing Sequence-to-Sequence (Seq2Seq) Neural Machine Translation (NMT) shows strong capability on High-Resource Languages (HRLs). However, this approach poses serious challenges when processing Low-Resource Languages (LRLs), because the model's expressiveness is limited by the small number of parallel sentence pairs available for training. This study uses adversarial and transfer learning techniques to mitigate the scarcity of sentence pairs in LRL corpora. We propose a new Low-resource, Adversarial, Cross-lingual (LAC) model for NMT. On the adversarial side, the LAC model consists of a generator and a discriminator: the generator is a Seq2Seq model that produces translations from the source to the target language, while the discriminator measures the gap between machine and human translations. In addition, we introduce transfer learning into the LAC model to help capture features from scarce resources, since some languages share the same subject-verb-object grammatical structure. Rather than reusing the entire pretrained LAC model, we transfer the pretrained generator and discriminator separately; the pretrained discriminator exhibited better performance in all experiments. Experimental results demonstrate that the LAC model achieves higher Bilingual Evaluation Understudy (BLEU) scores and has good potential to augment LRL translation.
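The generator–discriminator interplay described in the abstract can be sketched as a toy training loop. This is a minimal illustration under loose assumptions, not the authors' implementation: the "generator" here is a word-lookup table standing in for a Seq2Seq model, the "discriminator" is a simple overlap score standing in for a learned classifier, and all function names are hypothetical.

```python
import random

# Toy stand-ins for the LAC components (illustrative assumptions only;
# the paper's generator is a Seq2Seq NMT model and its discriminator
# is a learned model, not a fixed overlap score).

def generator_translate(src_tokens, table):
    # Word-by-word lookup standing in for Seq2Seq decoding.
    return [table.get(t, "<unk>") for t in src_tokens]

def discriminator_score(candidate, reference):
    # Fraction of positions matching the human reference:
    # 1.0 means "indistinguishable from human" in this toy setup.
    matches = sum(c == r for c, r in zip(candidate, reference))
    return matches / max(len(reference), 1)

def adversarial_step(pair, table, accept_prob=0.5):
    src, ref = pair
    hyp = generator_translate(src, table)
    reward = discriminator_score(hyp, ref)
    # Generator update: when the discriminator flags a gap, stochastically
    # adopt the reference mapping for mismatched positions (a crude
    # stand-in for a gradient/policy update).
    if reward < 1.0:
        for s, r in zip(src, ref):
            if table.get(s) != r and random.random() < accept_prob:
                table[s] = r
    return reward

random.seed(0)
table = {}
pair = (["jeg", "elsker", "deg"], ["i", "love", "you"])
for _ in range(20):
    reward = adversarial_step(pair, table)
print(reward)  # reaches 1.0 once all mappings are learned
```

The loop mirrors the adversarial setup at a high level: the discriminator's judgment of the machine output drives the generator's updates until the translations it produces are no longer distinguishable from the references.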
【 License 】
Unknown