Text Generation Using Bidirectional LSTM – A Walk-through in TensorFlow

This article was published as a part of the Data Science Blogathon

Text Generation

Text generation is a Natural Language Processing (NLP) task that involves automatically producing meaningful text. It can also power features such as autocomplete. We start with a prompt, a piece of text that serves as the base for generation. The model predicts the next word from the prompt, the predicted word is appended to the prompt, and the extended prompt is fed back into the model. By repeating this loop, we can generate long passages of text from a short prompt.
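
To make that feedback loop concrete, here is a minimal sketch of it in Keras. The names generate_text, model, tokenizer, and max_sequence_len are hypothetical placeholders: they stand for a trained next-word prediction model (such as the bidirectional LSTM built later in this walk-through), the tokenizer it was trained with, and the padded sequence length used during training.

```python
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

def generate_text(model, tokenizer, seed_text, next_words, max_sequence_len):
    """Repeatedly predict the next word and append it to the prompt."""
    for _ in range(next_words):
        # Convert the current prompt into a padded sequence of token ids
        token_list = tokenizer.texts_to_sequences([seed_text])[0]
        token_list = pad_sequences([token_list],
                                   maxlen=max_sequence_len - 1,
                                   padding='pre')
        # Pick the most probable next word from the model's output distribution
        predicted_id = np.argmax(model.predict(token_list, verbose=0), axis=-1)[0]
        next_word = tokenizer.index_word.get(int(predicted_id), '')
        # Feed the extended prompt back into the model on the next step
        seed_text += ' ' + next_word
    return seed_text

# Example usage (assumes `model`, `tokenizer`, `max_sequence_len` already exist):
# print(generate_text(model, tokenizer, "deep learning is", 10, max_sequence_len))
```

This greedy loop always takes the highest-probability word; sampling from the predicted distribution instead would give more varied output.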

Text generation is also called Natural Language Generation (NLG). It can be used to build chatbots and autocomplete features. The quality of the generated text depends on the quality of the corpus, the set of documents used to train the generation model. If the quality of the data is poor, the model's output will be poor as well (garbage in, garbage out).
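
As a small illustration of what training on a corpus involves, the sketch below fits a Keras Tokenizer on a toy two-sentence corpus to build the vocabulary the model would use. The corpus and variable names here are made up for illustration and would be replaced by your own documents.

```python
from tensorflow.keras.preprocessing.text import Tokenizer

# Hypothetical toy corpus; in practice this would be the set of documents
# (articles, dialogues, etc.) for the domain you want to generate text in.
corpus = [
    "deep learning models generate text word by word",
    "the quality of the corpus determines the quality of the output",
]

tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)               # build the vocabulary from the corpus
total_words = len(tokenizer.word_index) + 1  # +1 reserves index 0 for padding

print(total_words)                           # vocabulary size seen by the model
print(tokenizer.texts_to_sequences(["quality of the corpus"]))
```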
