Issue #52 – A Selection from ACL 2019

19 Sep 2019

Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

The Conference of the Association for Computational Linguistics (ACL) took place this summer, and over the past few months we have reviewed a number of preprints (see Issues #28, #41 and #43) that were published at ACL. In this post, we take a look at three more papers presented at the conference that we found particularly interesting in the context of state-of-the-art research in Neural MT.

Self-Supervised Neural Machine Translation (Ruiter et al.)

In Issue #21 we saw the effectiveness of using NMT model scores to filter noisy parallel corpora. The present paper is based on the claim that the word and sentence representations learned by NMT encoders are strong enough to judge online whether an input sentence pair is useful for training.
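To make that claim concrete, here is a minimal sketch of how encoder states can be pooled into sentence vectors and compared. The `encode` function below is a hypothetical stand-in for a real NMT encoder (simulated here with random vectors so the snippet runs), and plain cosine similarity is one simple scoring choice; the paper itself combines several similarity measures.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(sentence, dim=512):
    # Hypothetical stand-in for an NMT encoder: one vector per token.
    tokens = sentence.split()
    return rng.standard_normal((len(tokens), dim))

def sentence_embedding(sentence):
    # Mean-pool the encoder states into a single sentence vector.
    return encode(sentence).mean(axis=0)

def pair_score(src, tgt):
    # Cosine similarity between source and target sentence vectors:
    # higher means the pair looks more parallel to the current model.
    u, v = sentence_embedding(src), sentence_embedding(tgt)
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# A candidate pair is kept for training only if its score clears a threshold.
print(pair_score("the cat sat", "le chat était assis"))
```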

In this approach, the NMT system simultaneously selects its own training data and learns the internal NMT representations. The system does not need parallel data, only comparable data: it selects parallel sentence pairs from the comparable corpus until enough are available to create a training batch, as sketched below.
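The following sketch shows the shape of that joint selection-and-training loop, reusing the `pair_score` function from the snippet above. The `train_step` function, batch size, and threshold are all illustrative placeholders, not values from the paper; the point is that the model's own representations drive the selection, so selection quality improves as the encoder improves.

```python
BATCH_SIZE = 32
THRESHOLD = 0.5  # illustrative value, not from the paper

def train_step(batch):
    # Stand-in for one NMT parameter update on the selected batch.
    print(f"training on {len(batch)} self-selected pairs")

def self_supervised_loop(comparable_pairs):
    batch = []
    for src, tgt in comparable_pairs:
        # The model's current representations decide whether the
        # candidate pair is useful for training.
        if pair_score(src, tgt) >= THRESHOLD:
            batch.append((src, tgt))
        # Train as soon as a full batch has accumulated.
        if len(batch) == BATCH_SIZE:
            train_step(batch)
            batch = []
```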