Issue #128 – Using Context in Neural MT Training Objectives

29 April 2021


Author: Dr. Danielle Saunders, Research Scientist @ RWS

We have a guest post this week, though it's not really a "guest" post: the recently acquired SDL team has joined forces with Iconic as part of RWS! We are pleased to have Dr. Danielle Saunders describe her most recent paper on using context in Minimum Risk Training to improve machine translation tuning and to fix hallucinations. Enjoy!

Introduction

In the paper “Using Context in Neural Machine Translation Training Objectives” (Saunders et al., 2020) we introduce a robust version of Minimum Risk Training (MRT) for Neural Machine Translation (NMT), and show that it can outperform the standard approach while requiring fewer samples. We also summarize work
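For readers unfamiliar with MRT: the standard objective minimizes the expected cost (e.g. 1 − sentence-level BLEU) of sampled translations under the model's distribution, renormalized over the sample set and sharpened by a smoothness parameter α. The following is a minimal sketch of that expected-risk computation, assuming log-probabilities and costs are already available for each sample; the function name and inputs are illustrative, not from the paper:

```python
import math

def mrt_loss(sample_logprobs, sample_costs, alpha=0.005):
    """Expected risk over a set of sampled translations.

    sample_logprobs: model log-probabilities of each sampled translation
    sample_costs: cost of each sample vs. the reference (e.g. 1 - BLEU)
    alpha: smoothness parameter sharpening the sample distribution
    """
    # Sharpen, then renormalize over the sample set (stable softmax)
    scaled = [alpha * lp for lp in sample_logprobs]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    z = sum(exps)
    probs = [e / z for e in exps]
    # Expected cost under the renormalized sample distribution
    return sum(p * c for p, c in zip(probs, sample_costs))
```

With equal sample probabilities the loss is simply the mean cost; as α grows, mass concentrates on the most probable samples, so their costs dominate the objective.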


To finish reading, please visit source site