KoBE: Knowledge-Based Machine Translation Evaluation

We propose a simple and effective method for machine translation evaluation which does not require reference translations. Our approach is based on (1) grounding the entity mentions found in each source sentence and candidate translation against a large-scale multilingual knowledge base, and (2) measuring the recall of the grounded entities found in the candidate vs. those found in the source. Our approach achieves the highest correlation with human judgements on 9 out of the 18 language pairs from the WMT19 […]
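The recall computation in step (2) is easy to sketch. Below is a minimal Python illustration, assuming some entity-linking step (not shown) has already mapped the mentions in the source and in the candidate to knowledge-base identifiers; the `entity_recall` helper and the multiset counting are illustrative assumptions, not the paper's exact implementation.

```python
from collections import Counter

def entity_recall(source_entities, candidate_entities):
    """Recall of grounded entities: the fraction of entity IDs linked in the
    source sentence that also appear among the IDs linked in the candidate
    translation. Repeated mentions are handled as multisets."""
    src = Counter(source_entities)
    cand = Counter(candidate_entities)
    if not src:
        return 1.0  # no entities in the source, so nothing to recall
    matched = sum(min(count, cand[ent]) for ent, count in src.items())
    return matched / sum(src.values())

# Example with knowledge-base IDs produced by some entity-linking step
# (the linking itself is not shown here).
source_ids = ["Q90", "Q142"]   # e.g. Paris, France
candidate_ids = ["Q90"]        # only Paris survived the translation
print(entity_recall(source_ids, candidate_ids))  # 0.5
```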


Issue #74 – Transfer Learning for Neural Machine Translation

20 Mar 2020 | Author: Dr. Chao-Hong Liu, Machine Translation Scientist @ Iconic

Building machine translation (MT) for low-resource languages is a challenging task. This is especially true when training neural MT (NMT) systems, which require comparatively large amounts of parallel data. In this post, we review the work of Zoph et al. (2016) on training NMT systems for low-resource languages using transfer learning. Transfer Learning: The idea of transfer […]
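The core recipe in Zoph et al. (2016) is to train a parent model on a high-resource language pair, initialise a child model with the parent's parameters, and fine-tune the child on the low-resource pair. The PyTorch sketch below only illustrates the parameter-transfer step; the toy model, the vocabulary sizes, and the shape-matching rule are placeholders, not the architecture or constraints used in the paper.

```python
import torch
import torch.nn as nn

# Toy stand-in for an NMT model; Zoph et al. (2016) used attentional
# encoder-decoder models, which this placeholder does not reproduce.
def make_model(src_vocab, tgt_vocab, dim=64):
    return nn.ModuleDict({
        "src_embed": nn.Embedding(src_vocab, dim),
        "tgt_embed": nn.Embedding(tgt_vocab, dim),
        "encoder": nn.GRU(dim, dim, batch_first=True),
        "decoder": nn.GRU(dim, dim, batch_first=True),
        "out": nn.Linear(dim, tgt_vocab),
    })

# Parent: high-resource pair (e.g. French->English); child: low-resource pair
# sharing the target language, so target-side parameters transfer directly.
parent = make_model(src_vocab=32000, tgt_vocab=32000)
child = make_model(src_vocab=8000, tgt_vocab=32000)

parent_state = parent.state_dict()
child_state = child.state_dict()

# Copy every parent tensor whose shape matches the child's; the child's
# source embeddings (different vocabulary size) keep their random init.
transferred = {k: v for k, v in parent_state.items()
               if k in child_state and v.shape == child_state[k].shape}
child_state.update(transferred)
child.load_state_dict(child_state)

print(f"transferred {len(transferred)} of {len(child_state)} parameter tensors")
# ...then fine-tune `child` on the low-resource parallel data as usual.
```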


Issue #18 – Simultaneous Translation using Neural MT

23 Nov 2018 | Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

The term “simultaneous translation” or “simultaneous interpretation” refers to the case where a translator begins translating just a few seconds after a speaker begins speaking, and finishes just a few seconds after the speaker ends. There has been a lot of PR and noise about some recent proclamations, which were covered well in a recent article on Slator. In this week’s post, […]
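To make the latency behaviour described above concrete, here is a generic wait-k style streaming loop in Python. This particular policy is not the system the post reviews; it is only an illustration of starting to translate a few tokens behind the speaker and flushing the remainder once the speaker finishes. The `translate_prefix` callable is a placeholder for any model that can extend a partial translation from a partial source.

```python
def simultaneous_translate(source_stream, translate_prefix, k=3):
    """Wait-k style policy: hold back until k source tokens have arrived,
    then emit one target token per new source token, and flush the rest
    once the source is complete.

    `translate_prefix(src_prefix, tgt_prefix)` is a placeholder for any model
    that proposes the next target token given partial source and target.
    """
    src_prefix, tgt_prefix = [], []
    for token in source_stream:
        src_prefix.append(token)
        if len(src_prefix) >= k:              # keep pace with the speaker
            next_tok = translate_prefix(src_prefix, tgt_prefix)
            tgt_prefix.append(next_tok)
            yield next_tok
    # Speaker finished: emit the remaining target tokens shortly afterwards.
    while (next_tok := translate_prefix(src_prefix, tgt_prefix)) != "<eos>":
        tgt_prefix.append(next_tok)
        yield next_tok

# Toy usage with a dummy "model" that simply echoes source tokens.
dummy = lambda src, tgt: src[len(tgt)] if len(tgt) < len(src) else "<eos>"
print(list(simultaneous_translate(["guten", "morgen", "alle", "zusammen"], dummy)))
# ['guten', 'morgen', 'alle', 'zusammen']
```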
