Issue #33 – Neural MT for Spoken Language Translation

11 Apr 2019 | Author: Dr. Marco Turchi, Head of the Machine Translation group at Fondazione Bruno Kessler (FBK), Italy

This week, we have a guest post from Dr. Marco Turchi, head of the Machine Translation group at Fondazione Bruno Kessler (FBK) in Italy. Marco and his team have a very strong pedigree in the field, and have been at the forefront of academic research into Neural MT. In this issue, […]

Read more

Issue #32 – The Transformer Model: State-of-the-art Neural MT

04 Apr 2019 | Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

In this post, we will discuss the Transformer model (Vaswani et al. 2017), which is a state-of-the-art model for Neural MT. The Transformer model was published by the Google Brain and Google Research teams in June 2017 and has been a very popular architecture since then. It does not use either Recurrent Neural Networks (RNN) or Convolutional Neural Networks (CNN). Instead, […]
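To make the idea concrete, here is a minimal NumPy sketch of scaled dot-product attention, the mechanism the Transformer uses in place of recurrence and convolution. This is an illustration only – a single attention head, without the masking, multi-head projections, or layer stacking of the full model.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Vaswani et al. 2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over source positions
    return weights @ V                              # weighted sum of value vectors

# Toy example: 3 tokens, model dimension 4
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((3, 4)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)
```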

Read more

Issue #31 – Context-aware Neural MT

28 Mar 2019 | Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

In this week’s post, we take a look at ‘context-aware’ machine translation. This particular topic deals with how Neural MT engines can make use of external information to determine what translation to produce – “external information” meaning information other than the words in the sentence being translated. Other modalities, for instance speech, images, and videos, or even other sentences in the source document […]
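One of the simplest ways to give an engine access to such context is to concatenate the previous source sentence to the current one, separated by a break token, so the encoder can attend across the sentence boundary. The sketch below illustrates that preprocessing step only; the token name <BRK> and the single-sentence context window are assumptions made for illustration, and the post itself covers a range of richer approaches.

```python
def add_context(sentences, sep="<BRK>"):
    """Prepend each sentence's predecessor so the encoder sees document context."""
    augmented = []
    for i, sent in enumerate(sentences):
        context = sentences[i - 1] if i > 0 else ""
        augmented.append(f"{context} {sep} {sent}".strip())
    return augmented

doc = ["The bank raised rates.", "It cited inflation."]
print(add_context(doc))
# ['<BRK> The bank raised rates.',
#  'The bank raised rates. <BRK> It cited inflation.']
```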

Read more

Issue #30 – Reducing loss of meaning in Neural MT

28 Mar 2019 | Author: Raj Patel, Machine Translation Scientist @ Iconic

An important, and perhaps obvious, feature of high-quality machine translation systems is that they preserve the meaning of the source in the translation. That is to say, if we have two different source sentences with slightly different meanings, we should have slightly different translations. However, this nuance can be a challenge, even for state-of-the-art systems, particularly in cases where source […]

Read more

Issue #28 – Hybrid Unsupervised Machine Translation

07 Mar 2019 | Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

In Issue #11 of this series, we first looked directly at the topic of unsupervised machine translation – training an engine without any parallel data. Since then, it has gone from a promising concept to one that can produce effective systems that perform close to the level of fully supervised engines (trained with parallel data). The prospect of building good MT engines […]
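A core ingredient behind this progress is iterative back-translation: each translation direction generates synthetic parallel data from monolingual text to train the other. The skeleton below sketches just that loop; the train and back_translate functions are hypothetical stand-ins (the "model" here merely copies its input), not a real training pipeline.

```python
def train(pairs):
    """Stand-in for supervised NMT training on (source, target) pairs.
    Here it just returns a dummy model that copies its input."""
    return lambda sentence: sentence

def back_translate(model, monolingual):
    """Pair each monolingual target sentence with a synthetic source."""
    return [(model(sent), sent) for sent in monolingual]

mono_en = ["the cat sleeps"]
mono_fr = ["le chat dort"]

# In practice the models are initialised e.g. from cross-lingual word embeddings
fr2en, en2fr = train([]), train([])
for _ in range(3):  # iterative refinement
    en2fr = train(back_translate(fr2en, mono_fr))  # synthetic EN -> real FR
    fr2en = train(back_translate(en2fr, mono_en))  # synthetic FR -> real EN
```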

Read more

Issue #27 – Use case: Neural MT for the Life Sciences

27 Feb 2019 | Author: Dr. John Tinsley, CEO @ Iconic

Neural MT has had quite a significant impact on how global enterprises are looking at translation automation to improve existing workflows. Above and beyond that, however, organisations are considering how machine translation can transform key areas of their business. The reasons for this are clear – when adapted effectively for a particular use case, Neural MT can produce significantly better, […]

Read more

Issue #26 – Context and Copying in Neural MT

21 Feb 2019 | Author: Raj Patel, Machine Translation Scientist @ Iconic

When translating from one language to another, certain words and tokens need to be copied, rather than translated, into the target sentence. This includes things like proper nouns, names, numbers, and ‘unknown’ tokens. We want these to appear in the translation just as they were in the original text. Neural MT systems with subword vocabulary are capable of copying […]
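The point about subword vocabularies can be seen with a toy segmenter: an unseen name is broken into known fragments (and, in the worst case, single characters), so the model can emit it piece by piece rather than producing an <unk> token. The greedy longest-match routine below is a stand-in for a real trained BPE model, and the example vocabulary is invented.

```python
def segment(word, vocab):
    """Greedy longest-match segmentation into subword units."""
    pieces, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in vocab or j == i + 1:  # single characters always allowed
                pieces.append(piece)
                i = j
                break
    return pieces

vocab = {"Kess", "ler", "Fon", "da", "zione"}
print(segment("Kessler", vocab))     # ['Kess', 'ler']  -- copyable piece by piece
print(segment("Fondazione", vocab))  # ['Fon', 'da', 'zione']
```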

Read more

Issue #24 – Exploring language models for Neural MT

07 Feb 2019 | Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

Monolingual language models were a critical part of phrase-based Statistical Machine Translation systems. They are also used in unsupervised Neural MT systems (unsupervised meaning that no parallel data is available to supervise training – in other words, only monolingual data is used). However, they are not used in standard supervised Neural MT engines, and training language models has disappeared from common […]
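As a reminder of what a language model contributes, here is a toy bigram model of the kind that scored target-side fluency in phrase-based SMT: it estimates P(w_i | w_{i-1}) from monolingual counts. Purely illustrative – no smoothing or backoff, and the corpus is invented.

```python
from collections import Counter
import math

corpus = "the cat sat on the mat . the dog sat on the rug .".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def log_prob(sentence):
    """Sum of log P(w_i | w_{i-1}) under maximum-likelihood estimates."""
    words = sentence.split()
    return sum(math.log(bigrams[(a, b)] / unigrams[a])
               for a, b in zip(words, words[1:]))

print(log_prob("the cat sat on the rug"))  # fluent: finite log-probability
# An unseen bigram would get probability 0 -- why real LMs need smoothing.
```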

Read more

Issue #23 – Unbiased Neural MT

01 Feb 2019 | Author: Raj Patel, Machine Translation Scientist @ Iconic

A recent topic of conversation and interest in the area of Neural MT – and Artificial Intelligence in general – is gender bias. Neural models are trained using large text corpora which inherently contain social biases and stereotypes and, as a consequence, translation models inherit these biases. In this article, we’ll try to understand how gender bias affects translation quality and discuss a […]

Read more

Issue #22 – Mixture Models in Neural MT

24 Jan 2019 | Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

It goes without saying that Neural Machine Translation has become the state of the art in MT. However, one challenge we still face is developing a single, general MT system which works well across a variety of different input types. As we know from long-standing research into domain adaptation, a system trained on patent data doesn’t perform well when translating software documentation […]
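As a toy picture of the mixture-model idea: rather than one general system, several domain experts are combined, with a gate weighting each expert's prediction by how responsible it seems for the current input. The experts and gate values below are invented numbers for illustration, not a trained neural mixture.

```python
import numpy as np

def mixture_predict(expert_probs, gate_weights):
    """P(y|x) = sum_k p(k|x) * P_k(y|x): gate-weighted average of expert predictions."""
    gate = np.asarray(gate_weights, dtype=float)
    gate /= gate.sum()                      # normalise responsibilities
    return gate @ np.asarray(expert_probs)  # (K,) @ (K, vocab) -> (vocab,)

# Two 'experts' (patents vs. software docs) over a 3-word toy vocabulary
patent_expert   = [0.7, 0.2, 0.1]
software_expert = [0.1, 0.3, 0.6]
# The gate judges this input to look mostly like software documentation
print(mixture_predict([patent_expert, software_expert], [0.2, 0.8]))
# [0.22 0.28 0.5 ]
```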

Read more