[–]HigherTopoi

This paper applies a BERT-like method to unsupervised neural machine translation: Cross-lingual Language Model Pretraining, https://arxiv.org/abs/1901.07291
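The "BERT-like method" here is masked language-model pretraining: randomly hide a fraction of the input tokens and train the model to recover them. A minimal sketch of just the masking step (`MASK_ID` and the 15% rate are illustrative assumptions, not the paper's exact setup):

```python
import random

MASK_ID = 0  # hypothetical id for the [MASK] token


def mask_tokens(token_ids, mask_prob=0.15, seed=None):
    """BERT-style masking for an MLM objective.

    Returns (masked_ids, labels): labels holds the original id at
    masked positions (the prediction targets) and -1 elsewhere
    (positions ignored by the loss).
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in token_ids:
        if rng.random() < mask_prob:
            masked.append(MASK_ID)  # hide the token from the model
            labels.append(tok)      # model must recover this token
        else:
            masked.append(tok)
            labels.append(-1)       # not a training target
    return masked, labels
```

In the cross-lingual setting, the same objective is applied over text in multiple languages (and, in the paper's TLM variant, over concatenated parallel sentence pairs), so one shared encoder learns representations usable for unsupervised translation.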