Issue #44 – Tagged Back-Translation for Neural Machine Translation

04 Jul 2019

Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

Note from the editor: You may have noticed that our posts have been a little more technical in nature over the last few weeks. This reflects a broader trend in R&D: with the higher-level topics and “breakthroughs” already covered, scientists are now drilling down to optimise existing approaches. Today’s post, on the topic of synthetic data, is another example. Over the coming weeks and months, we will also be looking at Neural MT from a few different angles, with some guest posts, use cases, and, of course, a preview and review of the upcoming “Machine Translation Summit”, which takes place just five minutes from Iconic HQ in Dublin, Ireland next month. Stay tuned!

In this week’s post, we take a look at some new research that changes the outlook for back-translation in Neural MT. Back-translation (BT) consists of adding synthetic parallel training data produced by translating monolingual data in the target language into the source language; because the target side remains the original human-written text, no noise is introduced in the target language.
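To make the “tagged” part of the title concrete, here is a minimal sketch of how such data is commonly prepared: each back-translated source sentence is prefixed with a reserved tag token so the model can distinguish synthetic examples from genuine parallel data. The `translate_target_to_source` callable and the `<BT>` tag string are illustrative assumptions, not details taken from the post.

```python
# Sketch of tagged back-translation data preparation (illustrative, not
# Iconic's exact pipeline). Assumes a reverse-direction MT system is
# available as a callable that maps a target-language sentence to a
# synthetic source-language sentence.

BT_TAG = "<BT>"  # reserved token marking back-translated examples (assumed name)

def build_tagged_bt_pairs(target_sentences, translate_target_to_source, tag=BT_TAG):
    """Create synthetic (source, target) pairs from target-language monolingual data.

    Each synthetic source sentence is prefixed with a tag token so the NMT
    model can tell back-translated examples apart from genuine parallel data.
    The target side is left untouched, so it stays noise-free.
    """
    synthetic_pairs = []
    for tgt in target_sentences:
        src = translate_target_to_source(tgt)      # synthetic (possibly noisy) source
        tagged_src = f"{tag} {src}"                # mark the example as back-translated
        synthetic_pairs.append((tagged_src, tgt))  # human-written target kept as-is
    return synthetic_pairs

# Usage: mix these tagged synthetic pairs with the genuine parallel corpus
# before training; genuine pairs are left untagged.
```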

