Machine Translation Weekly 49: Paraphrasing using multilingual MT

It is a well-known fact that when you have a hammer, everything looks like a
nail. It is a less-known fact that when you have a sequence-to-sequence model,
everything looks like machine translation. One example of this thinking is the
paper Paraphrase Generation as Zero-Shot Multilingual Translation:
Disentangling Semantic Similarity from Lexical and Syntactic Diversity,
recently uploaded to arXiv by researchers from Johns Hopkins University.

The paper approaches the task of paraphrase generation, i.e., for a source
sentence, they want to generate a target sentence in the same language, with
the meaning as similar as possible to the source sentence, but worded as
differently as possible. Their approach does not need any training examples of
paraphrased sentence pairs. It only needs a multilingual machine translation
system (which is admittedly a complex system that not everyone just happens to
have on their hard drive). The training requires plenty of parallel data.
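The basic decoding trick can be sketched in a few lines of Python. Everything below (the `MultilingualMT` class, its lookup tables, the function names) is a hypothetical toy stand-in for a real trained multilingual model; it only illustrates the interface: ask the model to "translate" a sentence into its own source language, a direction it never saw during training, and contrast that with the classical round-trip pivot approach.

```python
class MultilingualMT:
    """Toy stand-in for a multilingual seq2seq MT model.

    A real model conditions generation on a target-language token.
    Here tiny lookup tables fake translation so the example runs."""

    EN_DE = {"the": "die", "cat": "Katze", "sleeps": "schläft"}
    DE_EN = {"die": "the", "Katze": "cat", "schläft": "sleeps"}
    # A crude imitation of what zero-shot en->en decoding might produce.
    EN_EN = {"the": "the", "cat": "feline", "sleeps": "is sleeping"}

    def translate(self, text: str, src: str, tgt: str) -> str:
        table = {("en", "de"): self.EN_DE,
                 ("de", "en"): self.DE_EN,
                 ("en", "en"): self.EN_EN}[(src, tgt)]
        return " ".join(table.get(tok, tok) for tok in text.split())


def paraphrase_zero_shot(model, sentence, lang="en"):
    # The paper's setting: decode into the SOURCE language, a direction
    # never seen in training; the output tends to be a paraphrase.
    return model.translate(sentence, src=lang, tgt=lang)


def paraphrase_pivot(model, sentence, pivot="de"):
    # The classical alternative: round-trip through a pivot language,
    # which often just reproduces the input.
    return model.translate(model.translate(sentence, "en", pivot), pivot, "en")


model = MultilingualMT()
print(paraphrase_zero_shot(model, "the cat sleeps"))  # the feline is sleeping
print(paraphrase_pivot(model, "the cat sleeps"))      # the cat sleeps
```

With a real model, the zero-shot direction is of course not a lookup table, but the contrast is the same: the pivot round-trip tends to restore the original wording, while same-language decoding can diverge lexically while keeping the meaning.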
