Machine Translation Weekly 50: Language-Agnostic Multilingual Representations

Pre-trained multilingual representations promise to make the current best NLP
models available even for low-resource languages. With a truly language-neutral
pre-trained multilingual representation, we could train a task-specific model
on English (or another language with available training data), and the model
would work for all languages the representation covers.
(Except that by doing so, we might transfer Western values into
low-resource language applications.)
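
To make the promise concrete, here is a minimal sketch of the zero-shot transfer recipe, not taken from any particular paper: it assumes the Hugging Face transformers and datasets libraries, the XNLI natural language inference dataset, and hyperparameters chosen only for brevity. A classifier on top of XLM-R is fine-tuned on English data alone and then evaluated on Swahili, which it never sees during fine-tuning.

```python
# Minimal zero-shot cross-lingual transfer sketch (assumed setup, for illustration only):
# fine-tune XLM-R on English NLI data, then evaluate on Swahili with no Swahili training.
import numpy as np
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)

def tokenize(batch):
    # XNLI examples are premise/hypothesis pairs with three-way entailment labels.
    return tokenizer(batch["premise"], batch["hypothesis"],
                     truncation=True, max_length=128)

# English training data (subsampled so the sketch runs quickly) ...
train_en = (load_dataset("xnli", "en", split="train")
            .shuffle(seed=0).select(range(20_000))
            .map(tokenize, batched=True))
# ... and a Swahili test set the model never sees during fine-tuning.
test_sw = load_dataset("xnli", "sw", split="test").map(tokenize, batched=True)

def accuracy(eval_pred):
    logits, labels = eval_pred
    return {"accuracy": float((np.argmax(logits, axis=-1) == labels).mean())}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="xlmr-xnli-en",
                           num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=train_en,
    eval_dataset=test_sw,
    tokenizer=tokenizer,
    compute_metrics=accuracy,
)
trainer.train()
# The gap between English and Swahili accuracy indicates how far the
# representations are from being truly language-neutral.
print(trainer.evaluate())
```

The same fine-tuned classifier can in principle be applied to any of the hundred-plus languages the encoder was pre-trained on; how much accuracy is lost relative to English is exactly where language neutrality, or its absence, shows up.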

There are several multilingual contextual embedding models (such as
multilingual BERT or XLM-R) covering over one hundred languages
that claim to be language-neutral enough to work in this
so-called zero-shot learning setup (i.e., a model is trained on one language
and applied to another). The models are indeed very good, but they are still
quite far from being language-neutral. A recent pre-print from the Technical
University of Darmstadt and the University of Copenhagen offers several remedies
for this lack of language neutrality.
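
One intuitive way to see the lack of language neutrality, a toy probe of my own rather than the method of the pre-print, is to compare the representations of a sentence and its translations: a perfectly language-neutral encoder would assign them nearly identical vectors. The sketch below mean-pools multilingual BERT's hidden states and compares sentence pairs with cosine similarity; the model choice, example sentences, and pooling strategy are all assumptions made for the illustration.

```python
# Toy probe of language neutrality (illustrative assumption, not the pre-print's method):
# a language-neutral encoder should give a sentence and its translation
# nearly identical vectors.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME).eval()

def embed(sentence: str) -> torch.Tensor:
    # Mean-pool the last hidden states over non-padding tokens.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
    mask = inputs["attention_mask"].unsqueeze(-1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

english = embed("The cat sits on the mat.")
german = embed("Die Katze sitzt auf der Matte.")
czech = embed("Kočka sedí na rohožce.")

# Perfect language neutrality would push these similarities towards 1.0;
# in practice the scores are noticeably lower and vary across language pairs.
print("en-de:", torch.cosine_similarity(english, german).item())
print("en-cs:", torch.cosine_similarity(english, czech).item())
```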
