Issue #13 – Evaluation of Neural MT Architectures

11 Oct18 | Author: Raj Nath Patel, Machine Translation Scientist @ Iconic

What are the different approaches to Neural MT? Since its relatively recent advent, the underlying technology has been based on one of three main architectures: Recurrent Neural Networks (RNN), Convolutional Neural Networks (CNN), and Self-Attention Networks (Transformer). For various language pairs, the non-recurrent architectures (CNN and Transformer) have outperformed RNNs, but there have been no solid explanations as to why. In this post, we’ll evaluate […]

Read more
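For readers new to the Transformer mentioned in the teaser above, its core operation, scaled dot-product self-attention, can be sketched in a few lines. The shapes, names, and dimensions below are illustrative, not drawn from any particular implementation.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k) projection matrices."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # (seq_len, seq_len) similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over positions
    return weights @ V                                # each position attends to all others

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                   # 5 tokens, model dimension 8
W = [rng.normal(size=(8, 4)) for _ in range(3)]
out = self_attention(X, *W)
print(out.shape)                              # (5, 4)
```

Unlike an RNN, nothing here is sequential: every position is computed from every other position in one matrix product, which is one commonly cited reason these models parallelise well.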

Issue #12 – Character-based Neural MT

04 Oct18 | Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

Most flavours of Machine Translation naturally use the word as the basis for learning models. Early work on Neural MT that followed this approach had to limit the vocabulary scope for practical reasons. This created problems when dealing with out-of-vocabulary words. One approach that was explored to solve this problem was character-based Neural MT. With the emergence of subword approaches, which almost solve the […]

Read more
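The subword approaches mentioned above are typically based on byte-pair encoding (BPE), which repeatedly merges the most frequent adjacent symbol pair in the training vocabulary. Real toolkits are more involved; the toy vocabulary below is invented purely to illustrate one merge step.

```python
from collections import Counter

def most_frequent_pair(words):
    """words: dict mapping a tuple-of-symbols word to its corpus frequency."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with a single merged symbol."""
    a, b = pair
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(a + b); i += 2
            else:
                out.append(symbols[i]); i += 1
        merged[tuple(out)] = freq
    return merged

vocab = {("l", "o", "w"): 5, ("l", "o", "w", "e", "r"): 2, ("n", "o", "w"): 3}
pair = most_frequent_pair(vocab)   # ("o", "w"), seen 10 times
vocab = merge_pair(vocab, pair)
print(pair, vocab)
```

Iterating this merge step a fixed number of times yields a vocabulary of subword units, so rare and unseen words decompose into known pieces instead of becoming out-of-vocabulary tokens.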

Issue #11 – Unsupervised Neural MT

27 Sep18 | Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

In this week’s article, we will explore unsupervised machine translation. In other words, training a machine translation engine without using any parallel data! As you might imagine, the potential implications of not needing any data to train a Neural MT engine could be huge. In general, most of the approaches in this direction still use some bilingual signal, for example using parallel data […]

Read more

Issue #10 – Evaluating Neural MT post-editing

20 Sep18 | Author: Dr. Joss Moorkens, Assistant Professor, Dublin City University

This week, we have a guest post from Prof. Joss Moorkens of Dublin City University. Joss is renowned for his work in the area of translation technology and, particularly, the evaluation of MT output for certain use cases. Building on the “human parity” topic from Issue #8 of this series, Joss describes his recent work on evaluation of Neural MT post-editing for […]

Read more

Issue #9 – Domain Adaptation for Neural MT

13 Sep18 | Author: Raj Nath Patel, Machine Translation Scientist @ Iconic

While Neural MT has raised the bar in terms of the quality of general purpose machine translation, it is still limited when it comes to more intricate or technical use cases. That is where domain adaptation — the process of developing and adapting MT for specific industries, content types, and use cases — has a big part to play. In this […]

Read more

Issue #8 – Is Neural MT on par with human translation?

05 Sep18 | Author: Dr. John Tinsley, CEO @ Iconic

The next few articles of the Neural MT Weekly will deal with the topic of quality and evaluation of machine translation. Since the advent of Neural MT, developments have moved fast, and we have seen quality expectation levels rise, in line with a number of striking proclamations about performance. Early claims of “bridging the gap between human and machine translation” […]

Read more

Issue #7 – Terminology in Neural MT

30 Aug18 | Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

In many commercial MT use cases, being able to use custom terminology is a key requirement for translation accuracy. The ability to guarantee the translation of specific input words and phrases is conveniently handled in Statistical MT (SMT) frameworks such as Moses. Because SMT is performed as a sequence of distinct steps, we can interject and specify directly […]

Read more

Issue #5 – Creating training data for Neural MT

15 Aug18 | Author: Prof. Andy Way, Deputy Director, ADAPT Research Centre

This week, we have a guest post from Prof. Andy Way of the ADAPT Research Centre in Dublin. Andy leads a world-class team of researchers at ADAPT who are working at the very forefront of Neural MT. The post expands on the topic of training data – originally presented as one of the “6 Challenges in NMT” from Issue #4 […]

Read more

Issue #4 – Six Challenges in Neural MT

08 Aug18 | Author: Dr. John Tinsley, CEO @ Iconic

A little over a year ago, Koehn and Knowles (2017) wrote a very appropriate paper entitled “Six Challenges in Neural Machine Translation” (in fact, there were 7 but only 6 were empirically tested). The paper set out a number of areas which, despite the technology’s rapid development, still needed to be addressed by researchers and developers of Neural MT. The seven challenges posed at […]

Read more

Issue #2 – Data Cleaning for Neural MT

25 Jul18 | Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

“Garbage in, Garbage out” – noisy data is a big problem for all machine learning tasks, and MT is no different. By noisy data, we mean bad alignments, poor translations, misspellings, and other inconsistencies in the data used to train the systems. Statistical MT systems are more robust and can cope with up to 10% noise in the training data without […]

Read more
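A minimal illustration of the kind of noise filtering alluded to above is a length-ratio check: sentence pairs whose source and target lengths are wildly unbalanced are often bad alignments. The function name, threshold, and example pairs below are illustrative choices, not recommendations from the article.

```python
def clean_parallel(pairs, max_ratio=3.0):
    """pairs: list of (source, target) sentence strings; drop likely-noisy pairs."""
    kept = []
    for src, tgt in pairs:
        ns, nt = len(src.split()), len(tgt.split())
        if ns == 0 or nt == 0:
            continue                            # drop empty segments
        if max(ns, nt) / min(ns, nt) > max_ratio:
            continue                            # drop suspiciously unbalanced pairs
        kept.append((src, tgt))
    return kept

data = [
    ("a fine day", "une belle journée"),
    ("hello", "bonjour à tous mes amis partout dans le monde entier"),  # likely misaligned
    ("", "vide"),                                                       # empty source
]
print(clean_parallel(data))  # keeps only the first pair
```

Real cleaning pipelines layer several such heuristics (language identification, deduplication, alignment scores), but even this single check removes a common class of bad alignments.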
1 906 907 908 909