Issue #66 – Neural Machine Translation Strategies for Low-Resource Languages

23 Jan 2020

This week we are pleased to welcome the newest member of our scientific team, Dr. Chao-Hong Liu. In this, his first post with us, he’ll give his views on two specific MT strategies, namely, pivot MT and zero-shot MT. While we have covered these topics in previous ‘Neural MT Weekly’ blog posts (Issue #54, Issue #40), they are topics that Chao-Hong worked on recently, prior to joining Iconic. Take it away, Chao-Hong!

Author: Dr. Chao-Hong Liu, Machine Translation Scientist @ Iconic

In this post we will briefly review and discuss two main strategies, pivot MT and zero-shot MT, for building neural machine translation (NMT) models without direct parallel data. The main motivation for developing these methods is the need to build MT systems for language pairs where direct parallel corpora do not exist or are very small. They are especially useful when we are building MT systems for low-resource languages, but they can also be applied in other situations, e.g. training MT systems for specific domains.

Pivot Machine Translation

The strategy of pivot MT is to translate through an intermediate “pivot” language: a source sentence is first translated into the pivot language (typically a high-resource language such as English), and the output is then translated into the target language, using two separately trained MT systems.
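The chaining idea can be sketched in a few lines. This is a minimal illustration, not a real MT system: the translation tables below are hypothetical word-for-word stand-ins for the two trained models (source→pivot and pivot→target), and serve only to show how the two systems are composed.

```python
# Toy stand-ins for two trained MT systems, pivoting through English.
# (Hypothetical word lists, for illustration only.)
SRC_TO_PIVOT = {"madra": "dog", "beag": "small"}   # e.g. Irish -> English
PIVOT_TO_TGT = {"dog": "cão", "small": "pequeno"}  # e.g. English -> Portuguese

def translate(sentence: str, table: dict) -> str:
    """Word-by-word lookup; unknown words pass through unchanged."""
    return " ".join(table.get(word, word) for word in sentence.split())

def pivot_translate(sentence: str, src_to_pivot: dict, pivot_to_tgt: dict) -> str:
    """Pivot MT: source -> pivot language -> target language."""
    pivot_sentence = translate(sentence, src_to_pivot)
    return translate(pivot_sentence, pivot_to_tgt)

print(pivot_translate("madra beag", SRC_TO_PIVOT, PIVOT_TO_TGT))  # cão pequeno
```

In practice each step would be a full NMT model rather than a lookup table, and errors made by the first system propagate into the second, which is one of the known drawbacks of pivoting.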