Articles About Natural Language Processing

Issue #54 – Pivot-based Transfer Learning for Neural MT

03 Oct 2019 | Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

Pivot-based Transfer Learning for Neural MT between Non-English Languages. Neural MT for many non-English language pairs remains a challenge because of the unavailability of direct parallel data between these languages. In general, translation between non-English languages, e.g. French to German, is done by pivoting through English, i.e., translating the French (source) input into English (pivot) and then the English (pivot) into German (target). […]
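The cascaded pivoting described in the excerpt can be sketched in a few lines. The translation functions below are hypothetical stand-ins for trained French→English and English→German models (the lookup tables are illustrative, not code from the post):

```python
def translate_fr_en(text):
    # Placeholder for a French -> English NMT model.
    lookup = {"Bonjour le monde": "Hello world"}
    return lookup[text]

def translate_en_de(text):
    # Placeholder for an English -> German NMT model.
    lookup = {"Hello world": "Hallo Welt"}
    return lookup[text]

def pivot_translate(text, src_to_pivot, pivot_to_tgt):
    # Source -> pivot (English) -> target: two cascaded decoding steps.
    return pivot_to_tgt(src_to_pivot(text))

result = pivot_translate("Bonjour le monde", translate_fr_en, translate_en_de)
```

Note that errors made in the first (source→pivot) step propagate into the second step, which is one motivation for the transfer-learning approach the post discusses.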

Read more

Issue #52 – A Selection from ACL 2019

19 Sep 2019 | Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

The Conference of the Association for Computational Linguistics (ACL) took place this summer, and over the past few months we have reviewed a number of preprints (see Issues 28, 41 and 43) which were published at ACL. In this post, we take a look at three more papers presented at the conference that we found particularly interesting, in the context of […]

Read more

Issue #51 – Takeaways from the 2019 Machine Translation Summit

12 Sep 2019 | Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

As you will probably have seen across our website and channels, the Machine Translation Summit took place a few weeks ago on our doorstep in Dublin, Ireland. In addition to sponsoring, hosting the MT social, and presenting our own paper, we also attended many talks, had a lot of great conversations, and since the conference, we have spent some […]

Read more

Issue #50 – Previewing the 2019 Machine Translation Summit

15 Aug 2019 | Author: The Iconic Scientific Team!

This week marks the 50th issue of The Neural MT Weekly. This is a remarkable milestone, as we originally set out for this to be an 8-part series. However, as the pace of research in this area continued without cease, and our readership grew, we simply had to keep going. The research behind these posts forms part of our weekly MT reading group in […]

Read more

Issue #49 – Representation Bottleneck in Neural MT

08 Aug 2019 | Author: Raj Patel, Machine Translation Scientist @ Iconic

In Neural MT, lexical features are fed to the network as lexical representations (aka word embeddings) at the first layer of the encoder and are refined as they propagate through the deep network of hidden layers. In this post, we'll try to understand how the lexical representation is affected as it goes deeper in the network, and investigate whether it affects translation quality. Representation […]
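The idea of embeddings being progressively refined can be illustrated with a toy encoder. This is only a sketch under our own assumptions (random weights, tanh layers with residual connections, no trained model), measuring how far each layer's output drifts from the lexical input:

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, d_model, n_layers = 100, 16, 4
# Lexical representations (word embeddings) fed to the first layer.
embedding = rng.normal(size=(vocab_size, d_model))
weights = [rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)
           for _ in range(n_layers)]

def encode(token_ids):
    # Embeddings enter at the bottom of the encoder stack...
    h = embedding[token_ids]
    states = [h]
    # ...and are refined as they propagate through the hidden layers.
    for w in weights:
        h = np.tanh(h @ w) + h  # residual keeps part of the lexical signal
        states.append(h)
    return states

states = encode([3, 17, 42])
# Distance of each layer's representation from the original embeddings:
drifts = [float(np.linalg.norm(s - states[0])) for s in states]
```

Tracking `drifts` layer by layer is a crude stand-in for the kind of probing the post describes: the deeper the layer, the further the representation tends to move from the purely lexical input.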

Read more

Issue #48 – It’s all French Belgian Fries to me… or The Art of Multilingual e-Disclosure (Part II)

01 Aug 2019 | Author: Jérôme Torres Lozano, Director of Professional Services, Inventus

This is the second of a two-part guest post from Jérôme Torres Lozano, the Director of Professional Services at Inventus, who shares his perspective on The Art of Multilingual e-Disclosure. In Part I, we learned about the challenges of languages in e-disclosure. In this post, he will discuss language identification and the translation options available […]

Read more

Issue #47 – It’s all French Belgian Fries to me, or The Art of Multilingual e-Disclosure (Part I)

25 Jul 2019 | Author: Jérôme Torres Lozano, Director of Professional Services, Inventus

Over the next two weeks, we're taking a slightly different approach on the blog. In today's article, the first of two parts, we will hear from Jérôme Torres-Lozano of Inventus, a user of Iconic's Neural MT solutions for e-discovery. He gives us an entertaining look at his experiences of the challenges of language, […]

Read more

Issue #46 – Augmenting Self-attention with Persistent Memory

18 Jul 2019 | Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

In Issue #32 we introduced the Transformer model as the new state of the art in Neural Machine Translation. Subsequently, in Issue #41, we looked at some approaches that aimed to improve upon it. In this post, we take a look at a significant change to the Transformer model, proposed by Sukhbaatar et al. (2019), which further improves its performance. Each Transformer layer consists of two types […]
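For context on the layer structure the excerpt starts to describe, here is a minimal single-head sketch of a standard Transformer layer with its two sublayer types, self-attention and feed-forward. This is our own toy illustration (random weights, no layer normalisation, masking, or multiple heads), not code from the paper:

```python
import numpy as np

d = 8
rng = np.random.default_rng(1)
Wq, Wk, Wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
W1 = rng.normal(size=(d, 4 * d)) / np.sqrt(d)
W2 = rng.normal(size=(4 * d, d)) / np.sqrt(4 * d)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(x):
    # Sublayer type 1: tokens attend to each other.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = softmax(q @ k.T / np.sqrt(d))
    return scores @ v

def feed_forward(x):
    # Sublayer type 2: position-wise feed-forward network (ReLU).
    return np.maximum(0.0, x @ W1) @ W2

def transformer_layer(x):
    x = x + self_attention(x)  # residual around sublayer 1
    x = x + feed_forward(x)    # residual around sublayer 2
    return x

x = rng.normal(size=(5, d))  # 5 token vectors of dimension d
y = transformer_layer(x)
```

Sukhbaatar et al.'s proposal modifies this structure, so having the two vanilla sublayers in mind makes the change easier to follow.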

Read more

Issue #45 – Improving Robustness in Real-World Neural Machine Translation

11 Jul 2019 | Author: Dr. John Tinsley, CEO & Co-founder @ Iconic

Next month, the 17th Machine Translation Summit will take place in Dublin, Ireland, and the Iconic team will be in attendance. Not only that, we will be presenting our own work – Gupta et al. (2019) – on some of the steps we take to improve the robustness, stability, and quality of the Neural MT engines that we run […]

Read more

Issue #44 – Tagged Back-Translation for Neural Machine Translation

04 Jul 2019 | Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

Note from the editor: You may have noticed that our posts have been a little more technical in nature over the last few weeks. This reflects a broader trend in R&D whereby higher-level topics and "breakthroughs" have been covered, and scientists are now drilling down to optimise existing approaches. This can be seen again in today's post on […]
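Tagged back-translation itself is simple to illustrate: a reserved tag token is prepended to the synthetic (back-translated) source sentences so the model can distinguish them from genuine parallel data during training. A minimal sketch, with the tag string and example sentences being our own illustrative choices:

```python
BT_TAG = "<BT>"

def tag_back_translated(source_sentence):
    # Prepend a reserved token marking this source as synthetic,
    # i.e. produced by back-translating monolingual target data.
    return f"{BT_TAG} {source_sentence}"

# Genuine parallel data: (source, target) pairs.
genuine = [("I like tea.", "Ich mag Tee.")]

# Synthetic pairs: the source side was generated by a target->source model.
synthetic = [("I drink coffee.", "Ich trinke Kaffee.")]

training_data = genuine + [(tag_back_translated(s), t) for s, t in synthetic]
```

The tag gives the model an explicit signal about data provenance, letting it exploit large amounts of synthetic data without the noisy synthetic sources degrading translation of clean input.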

Read more
1 65 66 67 68 69 71