Keeping Models Consistent between Pretraining and Translation for Low-Resource Neural Machine Translation

Recently, model pretraining has been successfully applied to unsupervised and semi-supervised neural machine translation. A cross-lingual language model uses a pretrained masked language model to initialize the encoder and decoder of the translation model, which greatly improves the translation...
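As a rough illustration of the initialization scheme the abstract describes, the sketch below shows one way a single pretrained masked language model could warm-start both the encoder and the decoder of a Transformer translation model. This is a hypothetical PyTorch sketch, not the authors' implementation; the model sizes and class names are assumptions.

```python
# Hypothetical sketch: initialise an NMT encoder and decoder from one
# pretrained masked language model (XLM-style warm start). Not the
# authors' code; sizes and names are illustrative assumptions.
import torch.nn as nn

VOCAB, D_MODEL, N_HEADS, N_LAYERS, FF = 32000, 512, 8, 6, 2048  # assumed sizes

class MaskedLM(nn.Module):
    """Pretrained masked LM: shared embedding + Transformer encoder stack."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D_MODEL)
        layer = nn.TransformerEncoderLayer(D_MODEL, N_HEADS, FF, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, N_LAYERS)
        self.lm_head = nn.Linear(D_MODEL, VOCAB)

class TranslationModel(nn.Module):
    """Encoder-decoder NMT model to be warm-started from the masked LM."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D_MODEL)
        enc_layer = nn.TransformerEncoderLayer(D_MODEL, N_HEADS, FF, batch_first=True)
        dec_layer = nn.TransformerDecoderLayer(D_MODEL, N_HEADS, FF, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, N_LAYERS)
        self.decoder = nn.TransformerDecoder(dec_layer, N_LAYERS)
        self.out = nn.Linear(D_MODEL, VOCAB)

def init_from_mlm(nmt: TranslationModel, mlm: MaskedLM) -> None:
    """Copy pretrained masked-LM weights into both sides of the NMT model."""
    nmt.embed.load_state_dict(mlm.embed.state_dict())
    nmt.encoder.load_state_dict(mlm.encoder.state_dict())
    for dec_layer, mlm_layer in zip(nmt.decoder.layers, mlm.encoder.layers):
        # Self-attention and feed-forward blocks come from the pretrained model;
        # the decoder's cross-attention has no MLM counterpart and keeps its
        # random initialisation.
        dec_layer.self_attn.load_state_dict(mlm_layer.self_attn.state_dict())
        dec_layer.linear1.load_state_dict(mlm_layer.linear1.state_dict())
        dec_layer.linear2.load_state_dict(mlm_layer.linear2.state_dict())
        dec_layer.norm1.load_state_dict(mlm_layer.norm1.state_dict())

mlm = MaskedLM()           # in practice, load pretrained checkpoint weights here
nmt = TranslationModel()
init_from_mlm(nmt, mlm)    # encoder and decoder now share the MLM's knowledge
```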


Bibliographic Details
Main Authors: Wenbo Zhang, Xiao Li, Yating Yang, Rui Dong, Gongxu Luo
Format: Article
Language: English
Published: MDPI AG 2020-11-01
Series: Future Internet
Online Access: https://www.mdpi.com/1999-5903/12/12/215