Machine Translation Weekly 83: On Language Identity and Zero-Shot Transfer

This week I will comment on two papers on zero-shot cross-lingual model
transfer that focus not on the representation quality but on the transfer
itself. The first one, Language Embeddings for Typology and Cross-lingual
Transfer Learning, comes from authors at UC Davis. The second,
Syntax-augmented Multilingual BERT for Cross-lingual Transfer, comes from
authors at UCLA and Facebook AI. Both papers will appear at this year's ACL.

Just a reminder: zero-shot model transfer means training a model on one
language for which training data exist and using the model with a different
language at inference time. This magic should be possible thanks to a
language-agnostic underlying representation that captures sentence structure
and meaning similarly regardless of the language.
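
To make the setup concrete, here is a minimal sketch of what zero-shot transfer looks like in practice, assuming the Hugging Face transformers library and multilingual BERT as the shared encoder; the sentiment task, the toy English examples, and the German test sentence are all illustrative placeholders, not taken from either paper.

```python
# A sketch of zero-shot cross-lingual transfer: fine-tune on English,
# then run inference on German with no German training data.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-multilingual-cased"  # one encoder shared across languages
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Fine-tune on English only (a single gradient step on toy data for brevity).
english_texts = ["The movie was great.", "The movie was terrible."]
labels = torch.tensor([1, 0])
batch = tokenizer(english_texts, padding=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()

# Zero-shot inference on a language never seen during fine-tuning:
# whatever works here is carried by the language-agnostic representation.
model.eval()
german = tokenizer(["Der Film war großartig."], return_tensors="pt")
with torch.no_grad():
    prediction = model(**german).logits.argmax(dim=-1)
print(prediction)  # ideally 1 (positive), despite training on English only
```

How well this actually works is exactly what is at stake in both papers: the transfer succeeds only to the extent that the multilingual representation really is language-neutral.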
