Best Practices for Data-Efficient Modeling in NLG: How to Train Production-Ready Neural Models with Less Data

Abstract

Natural language generation (NLG) is a critical component in conversational systems, owing to its role in formulating a correct and natural text response. Traditionally, NLG components have been deployed using template-based solutions. Although neural network solutions recently developed in the research community have been shown to provide several benefits, deployment of such model-based solutions has been challenging due to high latency, correctness
