Machine Translation and Multilinguality in July 2022

Here is my monthly summary of what I found worth reading on arXiv in the past month.
A preprint from JHU studies zero-shot
cross-lingual transfer using pretrained multilingual representations and comes
to the conclusion that it is an under-specified optimization problem. In other
words, with a multilingual representation model, there are potentially many
solutions that are good for the source language, but only some of them are also
good for the target language. In practice, the solution is probably proper
training in the source language combined with few-shot training in the target
language. (After all, a zero-shot setup is always more of theoretical than
practical interest: collecting a dozen labeled examples should always be
feasible.)
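The under-specification argument can be illustrated with a toy experiment (my own sketch, not from the preprint): a linear classifier is trained on "source" data where the second feature is always zero, so any weight on that feature fits the source perfectly. On "target" data where that feature does vary, accuracy depends on the randomly initialized, never-updated weight.

```python
import random

def train(seed, source):
    # Perceptron training of a linear classifier y = sign(w0*x0 + w1*x1).
    rng = random.Random(seed)
    w = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    for _ in range(100):
        for (x0, x1), y in source:
            pred = 1 if w[0] * x0 + w[1] * x1 > 0 else -1
            if pred != y:
                w[0] += y * x0
                w[1] += y * x1
    return w

def accuracy(w, data):
    correct = sum(1 for (x0, x1), y in data
                  if (1 if w[0] * x0 + w[1] * x1 > 0 else -1) == y)
    return correct / len(data)

# "Source language": feature 1 is always zero, so its weight w1 is never
# updated and the label only depends on feature 0.
source = [((x0, 0.0), 1 if x0 > 0 else -1)
          for x0 in [-2, -1, -0.5, 0.5, 1, 2]]
# "Target language": feature 1 now varies, so the unconstrained w1 matters.
target = [((x0, x1), 1 if x0 > 0 else -1)
          for x0 in [-2, -1, 1, 2] for x1 in [-3.0, 3.0]]

for seed in range(5):
    w = train(seed, source)
    print(f"seed {seed}: source acc {accuracy(w, source):.2f}, "
          f"target acc {accuracy(w, target):.2f}")
```

Every seed reaches perfect source accuracy, while target accuracy swings with the initialization — many equally good source solutions, only some of which transfer.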

Sockeye 3, an MT toolkit by Amazon, is out
and is now written in PyTorch.



