Text generation is a widely studied sub-field of Natural Language Processing (NLP) that aims to automatically produce informative text resembling human-written text, drawing on knowledge from computational linguistics and artificial intelligence. Word embeddings, the dominant word representation in text generation, are vector representations in which words with similar meanings have similar vectors. Contextual word embeddings go further and assign each word a representation that depends on its context, learned from the word's usage across many contexts, and they can encode knowledge that transfers across languages. Contextual word embeddings outperform traditional word representations by capturing syntactic and semantic properties that vary with the linguistic context. Deep learning approaches are effective across NLP tasks and have driven particular progress in text generation: deep neural networks can generate fluent, topic-consistent, and even personalized text. Contextual word embeddings learned by deep neural networks therefore allow text generation models to learn distributed representations efficiently and to generate meaningful text.
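The contrast between static and contextual word embeddings can be illustrated with a toy sketch. The example below is an assumption-laden simplification, not a real trained model: the "static" vectors are deterministic hash-derived stand-ins for trained embeddings, and the "contextual" vector simply mixes a word's static vector with the mean of its neighbours within a window. Real contextual embeddings (e.g. from ELMo or BERT) are produced by deep neural networks, but the same key property holds: the identical word receives different vectors in different sentences.

```python
import hashlib

DIM = 8  # toy embedding dimension (assumption for illustration)

def static_vector(word, dim=DIM):
    # Deterministic pseudo-random static embedding derived from a hash:
    # a stand-in for trained vectors. One fixed vector per word type.
    h = hashlib.sha256(word.encode("utf-8")).digest()
    return [b / 255.0 for b in h[:dim]]

def contextual_vector(tokens, i, window=2, dim=DIM):
    # Toy contextual embedding: blend the word's own static vector with the
    # mean of its neighbours' vectors, so the representation of the same
    # word differs from sentence to sentence.
    neighbours = [static_vector(t, dim)
                  for j, t in enumerate(tokens)
                  if j != i and abs(j - i) <= window]
    own = static_vector(tokens[i], dim)
    if not neighbours:
        return own
    mean = [sum(v[k] for v in neighbours) / len(neighbours) for k in range(dim)]
    return [0.5 * own[k] + 0.5 * mean[k] for k in range(dim)]

# "bank" in two different senses gets two different contextual vectors,
# while its static vector is identical in both sentences.
s1 = "deposit money in the bank".split()
s2 = "sat on the river bank".split()
v1 = contextual_vector(s1, s1.index("bank"))
v2 = contextual_vector(s2, s2.index("bank"))
print(v1 == v2)                                          # different in context
print(static_vector("bank") == static_vector("bank"))    # fixed out of context
```

The design point the sketch makes is that context sensitivity comes from letting surrounding words influence the representation; deep networks replace the naive window average here with learned, multi-layer functions of the whole sentence.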