Issue #68 – Incorporating BERT in Neural MT
07 Feb 2020 · Author: Raj Patel, Machine Translation Scientist @ Iconic

BERT (Bidirectional Encoder Representations from Transformers) has shown impressive results in various Natural Language Processing (NLP) tasks. However, how to apply BERT effectively in Neural MT has not been fully explored. In general, BERT is fine-tuned for downstream NLP tasks; for Neural MT, by contrast, a pre-trained BERT model is typically used to initialise the encoder in an encoder-decoder architecture. In this post we […]
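To make the encoder-initialisation idea concrete, here is a minimal sketch, assuming PyTorch and the Hugging Face `transformers` library. The checkpoint name, decoder hyperparameters, and the `BertEncoderNMT` class are illustrative assumptions, not the specific setup discussed in this post.

```python
# Sketch: initialise the encoder of an encoder-decoder NMT model from
# pre-trained BERT weights, paired with a randomly initialised decoder.
# Assumes PyTorch >= 1.9 and Hugging Face transformers; names are illustrative.
import torch
import torch.nn as nn
from transformers import BertModel

class BertEncoderNMT(nn.Module):
    def __init__(self, tgt_vocab_size, d_model=768, nhead=8, num_decoder_layers=6):
        super().__init__()
        # Encoder initialised from pre-trained BERT (d_model matches BERT-base's 768)
        self.encoder = BertModel.from_pretrained("bert-base-multilingual-cased")
        # Target-side embedding and Transformer decoder, trained from scratch
        self.tgt_embed = nn.Embedding(tgt_vocab_size, d_model)
        decoder_layer = nn.TransformerDecoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True
        )
        self.decoder = nn.TransformerDecoder(decoder_layer, num_layers=num_decoder_layers)
        self.generator = nn.Linear(d_model, tgt_vocab_size)

    def forward(self, src_ids, src_attention_mask, tgt_ids):
        # BERT produces contextual representations of the source sentence
        memory = self.encoder(
            input_ids=src_ids, attention_mask=src_attention_mask
        ).last_hidden_state
        # Causal mask so each target position attends only to earlier positions
        causal = nn.Transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        out = self.decoder(self.tgt_embed(tgt_ids), memory, tgt_mask=causal)
        return self.generator(out)  # logits over the target vocabulary
```

In this arrangement the source side benefits from BERT's pre-trained contextual representations, while only the decoder (and optionally the encoder, if it is not frozen) is trained on the parallel data.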