Machine Translation Weekly 72: Self-Training for Zero-Shot MT

This week, I will have a look at a pre-print that describes an unconventional
setup for zero-shot machine translation. The pre-print is titled
Self-Learning for Zero-Shot Neural Machine
Translation
and was written by authors from
the University of Trento.

First of all, I have some doubts about whether this is really an instance of
zero-shot learning (but that is just nitpicking; the paper is interesting
regardless of the terminology). In machine learning, zero-shot learning means
that a model trained for task A is capable of doing task B without being
explicitly trained for it. For example: a model is trained to perform
sentiment analysis in English, but it can also do it in German because it was
trained on top of a multilingual representation. This would be supervised
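
To make the cross-lingual example above concrete, here is a minimal sketch of
how such zero-shot transfer is usually set up. It assumes a multilingual
encoder (XLM-R) and the Hugging Face Transformers API, neither of which is
mentioned in the post; the English fine-tuning step is only indicated in
comments.

```python
# Minimal sketch of zero-shot cross-lingual transfer (assumes XLM-R and the
# Hugging Face Transformers library; these are illustrative choices, not the
# setup used in the pre-print discussed above).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "xlm-roberta-base"  # multilingual encoder shared across languages
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# 1) Fine-tune the classifier on English sentiment data only
#    (standard supervised training loop, omitted here for brevity).

# 2) Apply the model to German input, for which it has never seen labeled data:
#    the shared multilingual representation is what makes the transfer work.
german_sentence = "Der Film war überraschend gut."
inputs = tokenizer(german_sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print("predicted sentiment class:", logits.argmax(dim=-1).item())
```

Without the English fine-tuning in step 1, the classification head is randomly
initialized, so the printed prediction is only meaningful after that step.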