Highlights from Machine Translation and Multilinguality in November 2023

Here are a couple of articles that caught my attention in November.

Narrowing the Gap between Zero- and Few-shot Machine Translation by Matching Styles

A team from Johns Hopkins University published a pre-print that belongs to the
currently trendy genre: stuff we can do with LLMs. This time, it is about how
to use them efficiently for domain-specific machine translation. It is known that
few-shot prompting works much better than zero-shot prompting, but you need to
select proper parallel examples. In other words, you need an in-domain parallel
corpus for domain-specific translation. This paper says that LLMs already
know how to translate; they only need a hint of style or domain, which can be
monolingual, so no parallel data is required. They propose to do the translation with monolingual in-domain examples serving as a style hint, rather than with parallel few-shot demonstrations.
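
To make the difference concrete, here is a minimal sketch of the prompting setups in Python. This is my illustration, not code from the paper: the example sentences are invented, the prompt wording is only one plausible phrasing of the style hint, and the functions just build prompt strings to feed to whatever model API you use.

```python
# Sketch of zero-shot, few-shot, and style-hinted prompting for MT.
# All example data is made up; these functions only construct prompts.

def zero_shot_prompt(source: str) -> str:
    """Plain zero-shot translation prompt."""
    return f"Translate the following English sentence into German:\n{source}\nGerman:"

def few_shot_prompt(source: str, parallel_examples: list[tuple[str, str]]) -> str:
    """Classic few-shot prompt: requires in-domain *parallel* examples."""
    demos = "\n".join(f"English: {en}\nGerman: {de}" for en, de in parallel_examples)
    return f"{demos}\nEnglish: {source}\nGerman:"

def style_hint_prompt(source: str, monolingual_examples: list[str]) -> str:
    """Prompt that hints at the target style/domain with *monolingual*
    target-language text only, so no parallel corpus is required."""
    hints = "\n".join(monolingual_examples)
    return (
        "Here are examples of the style and domain the output should match:\n"
        f"{hints}\n\n"
        "Translate the following English sentence into German, matching that style:\n"
        f"{source}\nGerman:"
    )

if __name__ == "__main__":
    src = "The patient was administered 5 mg of the drug daily."
    mono = [
        "Dem Patienten wurden täglich 10 mg des Wirkstoffs verabreicht.",
        "Die Dosierung ist bei eingeschränkter Nierenfunktion anzupassen.",
    ]
    print(style_hint_prompt(src, mono))
```

The contrast between `few_shot_prompt` and `style_hint_prompt` is the point of the paper: in the latter, the demonstrations are monolingual target-language text, so they only pin down style and domain, while the translation ability comes from the model itself.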