Improving Neural Cross-lingual Abstractive Summarization via Employing Optimal Transport Distance for Knowledge Distillation

This repository contains the implementation of the paper "Improving Neural Cross-lingual Abstractive Summarization via Employing Optimal Transport Distance for Knowledge Distillation".

Thong Nguyen, Luu Anh Tuan (AAAI 2022)

In this paper, we propose a novel Knowledge Distillation framework to tackle Neural Cross-Lingual Summarization for morphologically or structurally distant languages, distilling the knowledge of a monolingual summarization teacher into the cross-lingual student through a distillation loss based on an Optimal Transport distance between their representations. Extensive experiments in both high and low-resourced settings on multiple Cross-Lingual Summarization datasets that belong to pairs of morphologically and structurally distant languages demonstrate the effectiveness of our method.
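For a concrete picture of the distillation objective, the sketch below computes an entropy-regularized Optimal Transport cost between teacher and student hidden states using log-domain Sinkhorn iterations. This is a minimal illustrative sketch, not the code in this repository: the function name `sinkhorn_ot_loss`, the squared-Euclidean token cost, the uniform marginals, and the defaults for `eps` and `n_iters` are all assumptions, and a debiased Sinkhorn Divergence, as described in the paper, would additionally subtract the two self-transport terms OT(teacher, teacher) and OT(student, student).

```python
import math

import torch


def sinkhorn_ot_loss(teacher_h: torch.Tensor,
                     student_h: torch.Tensor,
                     eps: float = 0.1,
                     n_iters: int = 50) -> torch.Tensor:
    """Entropy-regularized optimal transport cost between teacher and
    student hidden states, computed with log-domain Sinkhorn iterations.

    teacher_h: [n, d] monolingual teacher hidden states
    student_h: [m, d] cross-lingual student hidden states
    """
    # Gradients should flow only into the student, not the teacher.
    teacher_h = teacher_h.detach()

    # Cost matrix: pairwise squared Euclidean distances between tokens.
    cost = torch.cdist(teacher_h, student_h, p=2) ** 2
    n, m = cost.shape

    # Uniform marginals over the two token sequences (kept in log space).
    log_mu = torch.full((n,), -math.log(n), device=cost.device)
    log_nu = torch.full((m,), -math.log(m), device=cost.device)

    # Dual potentials, updated in the log domain for numerical stability.
    u = torch.zeros(n, device=cost.device)
    v = torch.zeros(m, device=cost.device)
    for _ in range(n_iters):
        u = u + eps * (log_mu - torch.logsumexp(
            (-cost + u[:, None] + v[None, :]) / eps, dim=1))
        v = v + eps * (log_nu - torch.logsumexp(
            (-cost + u[:, None] + v[None, :]) / eps, dim=0))

    # Recover the transport plan and return the resulting transport cost.
    pi = torch.exp((-cost + u[:, None] + v[None, :]) / eps)
    return (pi * cost).sum()
```

In training, a term like this would typically be added to the standard summarization cross-entropy loss with a weighting coefficient, so the student is optimized jointly for summary quality and for aligning its cross-lingual hidden states with the teacher's monolingual ones.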