Issue #58 – Quantisation of Neural Machine Translation models

31 Oct 2019 · Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

When large amounts of training data are available, the quality of Neural MT engines increases with the size of the model. However, larger models imply decoding with more parameters, which makes the engine slower at test time. Improving the trade-off between model compactness and translation quality is an active research topic. One of the ways to achieve more compact models […]
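As a hedged illustration of the general idea behind making models more compact via quantisation (not necessarily the specific method the full post covers), here is a minimal NumPy sketch of symmetric post-training int8 quantisation of one weight tensor. All names, sizes, and values are hypothetical.

```python
import numpy as np

# Hypothetical weight matrix standing in for one NMT parameter tensor.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.1, size=(4, 4)).astype(np.float32)

def quantize_int8(w):
    """Symmetric linear quantisation of a float tensor to int8."""
    scale = np.abs(w).max() / 127.0          # map the largest magnitude to 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor from the int8 representation."""
    return q.astype(np.float32) * scale

q, scale = quantize_int8(weights)
approx = dequantize(q, scale)

print(q.dtype, weights.dtype)   # int8 float32: 4x smaller storage per value
```

The reconstruction error is bounded by half a quantisation step, which is why decoding with int8 weights can stay close to full-precision quality while shrinking the model.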

Read more

Issue #57 – Simple and Effective Noisy Channel Modeling for Neural MT

24 Oct 2019 · Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

Neural MT is widely used today, and its results are undeniably better than those of the statistical machine translation (SMT) systems used earlier. One of the core components of an SMT system was the language model. In this post, we will look at how we can benefit from a language model in Neural MT, too. In particular, we will […]
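As a rough sketch of how a language model can be brought back into Neural MT: the noisy channel formulation scores a candidate translation y of source x by combining a direct model log p(y|x), a channel model log p(x|y), and a language model log p(y). The snippet below reranks two hypothetical candidates; the scores and weights are made up for illustration and are not from the paper.

```python
# Hypothetical log-probabilities for candidate translations of one source
# sentence; in a real system these come from three trained models.
candidates = {
    "sie ist hier": {"direct": -1.2, "channel": -2.0, "lm": -1.5},
    "sie sind hier": {"direct": -1.0, "channel": -3.5, "lm": -1.4},
}

def noisy_channel_score(s, w_direct=1.0, w_channel=0.5, w_lm=0.5):
    """Weighted combination of log-probabilities for noisy channel reranking."""
    return (w_direct * s["direct"]
            + w_channel * s["channel"]
            + w_lm * s["lm"])

best = max(candidates, key=lambda y: noisy_channel_score(candidates[y]))
print(best)   # "sie ist hier": the channel model and LM overturn the direct model
```

Here the direct model alone would prefer the second candidate, but the channel and language model scores tip the ranking the other way, which is exactly the kind of correction this reranking enables.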

Read more

Issue #56 – Scalable Adaptation for Neural Machine Translation

17 Oct 2019 · Author: Raj Patel, Machine Translation Scientist @ Iconic

Although current research has explored numerous approaches for adapting Neural MT engines to different languages and domains, fine-tuning remains the most common approach. In fine-tuning, the parameters of a pre-trained model are updated for the target language or domain in question. However, fine-tuning requires training and maintenance of a separate model for each target task (i.e. a separate MT engine for every […]
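One lightweight alternative to maintaining a full fine-tuned model per task, common in the adapter literature this line of work builds on, is to insert small residual bottleneck layers into a frozen base model so that only a few parameters are trained per language or domain. A minimal NumPy sketch, with hypothetical dimensions:

```python
import numpy as np

def adapter(h, w_down, w_up):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual add.
    Only w_down and w_up are trained per task; the base model stays frozen."""
    z = np.maximum(h @ w_down, 0.0)   # ReLU bottleneck
    return h + z @ w_up               # residual connection keeps shapes intact

rng = np.random.default_rng(0)
d_model, d_bottleneck = 8, 2          # illustrative sizes, not from the paper
h = rng.normal(size=(1, d_model))     # a hidden state from the frozen model
w_down = rng.normal(size=(d_model, d_bottleneck)) * 0.1
w_up = rng.normal(size=(d_bottleneck, d_model)) * 0.1

out = adapter(h, w_down, w_up)
print(out.shape)   # (1, 8): same shape as the input, so it slots between layers
```

Because the output shape matches the input, such a module can be dropped between existing layers, and per-task storage shrinks from a whole engine to two small matrices per layer.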

Read more

Issue #55 – Word Alignment from Neural Machine Translation

10 Oct 2019 · Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

Word alignments were the cornerstone of all previous approaches to statistical MT. You take your parallel corpus, align the words, and build from there. In Neural MT, however, word alignment is no longer needed as an input to the system. That being said, research is coming back around to the idea that it remains useful in real-world practical scenarios for […]
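A common starting heuristic in work on extracting alignments from Neural MT, offered here only as background rather than as the post's method, is to align each target word to the source position receiving the highest attention weight. A toy sketch with a made-up attention matrix:

```python
import numpy as np

# Hypothetical decoder attention weights: rows are target positions,
# columns are source positions; each row sums to 1.
attention = np.array([
    [0.1, 0.8, 0.1],   # target word 0 attends mostly to source word 1
    [0.7, 0.2, 0.1],   # target word 1 attends mostly to source word 0
    [0.2, 0.1, 0.7],
])

# Heuristic: align each target word to its argmax source position.
alignment = attention.argmax(axis=1)
print(alignment.tolist())   # [1, 0, 2] — captures the local reordering
```

Attention is not the same thing as alignment, which is precisely why this heuristic is imperfect and why dedicated methods for recovering alignments from NMT are an active topic.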

Read more

Issue #54 – Pivot-based Transfer Learning for Neural MT

03 Oct 2019 · Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

Pivot-based Transfer Learning for Neural MT between Non-English Languages. Neural MT for many non-English languages is still a challenge because of the unavailability of direct parallel data between these languages. In general, translation between non-English languages, e.g. French to German, is usually done by pivoting through English, i.e. translating the French (source) input into English (pivot) and then the English (pivot) into German. […]
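The pivoting baseline described above is just the composition of two engines. In the sketch below, the translate functions are dictionary stubs standing in for real French→English and English→German NMT systems; all names and sentences are hypothetical.

```python
# Dictionary stubs standing in for two trained NMT engines.
def translate_fr_en(text):
    lookup = {"bonjour le monde": "hello world"}
    return lookup.get(text, text)

def translate_en_de(text):
    lookup = {"hello world": "hallo welt"}
    return lookup.get(text, text)

def pivot_translate(text, src_to_pivot, pivot_to_tgt):
    """Translate source -> pivot -> target by chaining two engines."""
    return pivot_to_tgt(src_to_pivot(text))

print(pivot_translate("bonjour le monde", translate_fr_en, translate_en_de))
# hallo welt
```

The weakness motivating the paper is visible even in this toy: any error or ambiguity introduced in the pivot step is passed on to the second engine with no way to recover the original source.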

Read more

Issue #52 – A Selection from ACL 2019

19 Sep 2019 · Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

The Conference of the Association for Computational Linguistics (ACL) took place this summer, and over the past few months we have reviewed a number of preprints (see Issues 28, 41 and 43) which were published at ACL. In this post, we take a look at three more papers presented at the conference that we found particularly interesting, in the context of […]

Read more

Issue #51 – Takeaways from the 2019 Machine Translation Summit

12 Sep 2019 · Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

As you will probably have seen across our website and channels, the Machine Translation Summit took place a few weeks ago on our doorstep in Dublin, Ireland. In addition to sponsoring, hosting the MT social, and presenting our own paper, we also attended many talks, had a lot of great conversations, and since the conference, we have spent some […]

Read more

Issue #50 – Previewing the 2019 Machine Translation Summit

15 Aug 2019 · Author: The Iconic Scientific Team!

This week marks the 50th issue of The Neural MT Weekly. This is a remarkable milestone, as we originally set out for this to be an 8-part series. However, as the pace of research in this area continued without cease, and our readership grew, we simply had to keep going. The research behind these posts forms part of our weekly MT reading group in […]

Read more

Issue #49 – Representation Bottleneck in Neural MT

08 Aug 2019 · Author: Raj Patel, Machine Translation Scientist @ Iconic

In Neural MT, lexical features are fed into the first layer of the encoder as lexical representations (aka word embeddings) and refined as they propagate through the deep network of hidden layers. In this post, we'll try to understand how the lexical representation is affected as it goes deeper into the network, and investigate whether this affects the translation quality. Representation […]

Read more

Issue #48 – It’s all French Belgian Fries to me… or The Art of Multilingual e-Disclosure (Part II)

01 Aug 2019 · Author: Jérôme Torres Lozano, Director of Professional Services, Inventus

This is the second of a two-part guest post from Jérôme Torres Lozano, the Director of Professional Services at Inventus, who shares his perspective on The Art of Multilingual e-Disclosure. In Part I, we learned about the challenges of languages in e-disclosure. In this post, he will discuss language identification and the translation options available […]

Read more
1 898 899 900 901 902 905