Issue #118 – EDITOR: a Repositioning Transformer with Soft Lexical Constraints

18 Feb 2021

Author: Dr. Karin Sim, Machine Translation Scientist @ Iconic

EDITOR: an Edit-Based Transformer with Repositioning for Neural Machine Translation with Soft Lexical Constraints

Introduction

On our blog a couple of weeks ago (issue 116), Patrik explored fully non-autoregressive machine translation, highlighting tricks such as dependency reduction that allow quality to be maintained while retaining the speed-up over autoregressive MT. Today we revisit non-autoregressive translation (NAT), examining EDITOR (Xu and Carpuat, 2020), which builds on the Levenshtein Transformer (see issues 82 and 86 for a recap) to bring both faster decoding and improvements in translation quality.

In a nutshell

EDITOR introduces a new reposition operation which is more effective and more flexible than relying on deletion and insertion alone: it allows the model to exploit soft lexical constraints more effectively than the Levenshtein Transformer, while decoding considerably faster than lexically constrained beam search. The sketch below illustrates the idea.
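To make the reposition-then-insert cycle concrete, here is a minimal Python sketch of one refinement step over a toy hypothesis. The function names (reposition, insert_placeholders, fill_placeholders), the hard-coded actions and the toy English example are assumptions for illustration only, not the authors' implementation; in EDITOR itself each action is predicted by classifiers over the decoder states, and the hypothesis is initialised from the user's lexical constraints.

```python
# Illustrative sketch of one EDITOR-style refinement step:
# reposition existing tokens, then insert placeholders and fill them in.
# All names and the hard-coded actions below are assumptions for illustration.

from typing import List, Sequence

PLH = "<plh>"  # placeholder token produced by the insertion step


def reposition(tokens: Sequence[str], indices: Sequence[int]) -> List[str]:
    """Apply a reposition action: each entry of `indices` is the 1-based
    position of an existing token to place in that output slot, and 0 means
    the slot is dropped. This generalises the Levenshtein Transformer's
    deletion, since tokens can also be reordered or copied."""
    return [tokens[i - 1] for i in indices if i > 0]


def insert_placeholders(tokens: Sequence[str], counts: Sequence[int]) -> List[str]:
    """Insert `counts[i]` placeholders before token i; the final entry of
    `counts` adds placeholders after the last token."""
    out: List[str] = []
    for i, c in enumerate(counts):
        out.extend([PLH] * c)
        if i < len(tokens):
            out.append(tokens[i])
    return out


def fill_placeholders(tokens: Sequence[str], fills: Sequence[str]) -> List[str]:
    """Replace placeholders, left to right, with predicted tokens."""
    it = iter(fills)
    return [next(it) if t == PLH else t for t in tokens]


# Toy walk-through: the hypothesis starts from the user's soft lexical
# constraints rather than an empty sequence, then gets refined.
constraints = ["cat", "black"]                 # preferred target terms, any order
hyp = reposition(constraints, [2, 1])          # -> ["black", "cat"]
hyp = insert_placeholders(hyp, [1, 0, 2])      # -> ["<plh>", "black", "cat", "<plh>", "<plh>"]
hyp = fill_placeholders(hyp, ["the", "sleeps", "."])
print(hyp)                                     # -> ["the", "black", "cat", "sleeps", "."]
```

Starting from the constraint tokens rather than an empty sequence is what makes the constraints "soft": because the reposition step can also drop or reorder them, the model remains free to discard a constraint that does not fit the translation.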

To finish reading, please visit source site