In Natural Language Generation (NLG), text generation is one of the most important applications of language models. Existing language models learn a function that maps a sequence of words to the next word, generating text for a given set of input words without regard to pragmatics and without modeling the contextual information in the sentence.
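To make the mapping concrete, the following is a minimal sketch of next-word prediction using a toy bigram model over a tiny hypothetical corpus (the corpus, function names, and greedy decoding strategy are illustrative assumptions, not the models discussed here):

```python
from collections import Counter, defaultdict

# Toy corpus; a real language model is trained on a large unlabeled corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count bigram transitions: for each word, how often each successor follows it.
transitions = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent successor of `word` (greedy next-word prediction)."""
    successors = transitions[word]
    return successors.most_common(1)[0][0] if successors else None

def generate(seed, length=5):
    """Greedily extend `seed` by repeatedly predicting the next word."""
    out = [seed]
    for _ in range(length):
        nxt = predict_next(out[-1])
        if nxt is None:
            break
        out.append(nxt)
    return " ".join(out)

print(predict_next("the"))   # "cat": it follows "the" more often than any other word
print(generate("the", 4))
```

Note how the model conditions only on the immediately preceding word; neural language models extend this idea to longer contexts, but the objective remains predicting the successive word.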
Existing techniques for training language models are often inconsistent with the goals of many language generation tasks, such as generative question answering and conversational response generation, which require producing new text given a context. Pre-trained deep neural language models are therefore necessary: by learning language patterns from a large unlabeled corpus, they accurately capture how words co-occur with one another for text prediction. In contrast, task bots must detect user intents and take a sequence of goal-directed actions, grounded in task-specific knowledge and the dialog belief state, to complete a task.