
Research Proposal on Deep Autoencoder based Text Generation for Natural Language

  Text generation in Natural Language Processing (NLP) automatically produces informative text that resembles human-written text by drawing on computational linguistics and artificial intelligence. Deep learning approaches are effective for NLP tasks and have made particular progress in text generation. A deep learning model for text generation learns vector representations of words, trains a deep neural network to predict the next word in a sequence, and thereby generates meaningful text. However, labeling unstructured data for deep learning is very time-consuming, and autoencoders are one approach to addressing this issue.
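  To make the next-word prediction step concrete, the following minimal sketch (assuming PyTorch, with a toy vocabulary size and random token ids standing in for real text) embeds tokens into vectors, runs them through a recurrent network, and trains it to predict the next word in the sequence.

```python
import torch
import torch.nn as nn

class NextWordLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)   # word ids -> vector representations
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)       # hidden state -> next-word logits

    def forward(self, token_ids):
        x = self.embed(token_ids)        # (batch, seq_len, embed_dim)
        h, _ = self.rnn(x)               # (batch, seq_len, hidden_dim)
        return self.out(h)               # logits over the vocabulary at every position

# Toy training step: inputs are all but the last token, targets are the next tokens.
vocab_size = 1000                                 # illustrative vocabulary size
model = NextWordLM(vocab_size)
tokens = torch.randint(0, vocab_size, (2, 10))    # random ids standing in for real text
logits = model(tokens[:, :-1])
loss = nn.functional.cross_entropy(logits.reshape(-1, vocab_size),
                                   tokens[:, 1:].reshape(-1))
loss.backward()
```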

  Autoencoders comprise an encoder and a decoder and are trained in an unsupervised manner on unlabeled data. For text generation, autoencoders trained without labeled data can outperform supervised systems. Deep autoencoder-based text generation for natural language text significantly improves the quality of generated responses and is effectively applied to various text generation tasks.
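  A minimal sketch of such a sequence autoencoder is shown below, again assuming PyTorch and illustrative hyperparameters: the encoder compresses a token sequence into a latent vector and the decoder reconstructs the same sequence, so the training signal comes from the unlabeled text itself.

```python
import torch
import torch.nn as nn

class SeqAutoencoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        x = self.embed(token_ids)
        _, latent = self.encoder(x)          # latent: utterance-level representation
        dec_h, _ = self.decoder(x, latent)   # teacher-forced reconstruction
        return self.out(dec_h)               # logits used to rebuild the input tokens

# Unsupervised training step: the input sequence is its own target, so no labels are needed.
vocab_size = 1000                                 # illustrative vocabulary size
model = SeqAutoencoder(vocab_size)
tokens = torch.randint(0, vocab_size, (4, 12))    # random ids standing in for real text
logits = model(tokens)
loss = nn.functional.cross_entropy(logits.reshape(-1, vocab_size), tokens.reshape(-1))
loss.backward()
```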

  • Exploiting deep learning for text generation has attracted growing research interest because it brings Artificial Intelligence (AI) close to human capabilities.

  • Presently, unsupervised learning strategies for deep learning models are used for natural language generation instead of supervised learning frameworks.

  • An unsupervised learning strategy yields better performance and improves the quality of generated responses in text generation.

  • A deep autoencoder is an unsupervised deep neural network that supports representation learning and eliminates the need for data labeling.

  • Utilizing an autoencoder for dialogue generation captures the critical dependency relations needed to generate coherent and fluent responses from learned utterance-level semantic representations.

  • Another notable improvement in unsupervised natural language generation is the denoising autoencoder, which effectively generates correct sentences from corrupted or structured input, as sketched after this list.
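  The denoising variant can be sketched by corrupting the input before encoding and computing the reconstruction loss against the clean sentence. The snippet below reuses the SeqAutoencoder class from the previous sketch; the mask token id and corruption rate are illustrative assumptions, not fixed choices.

```python
import torch
import torch.nn as nn

MASK_ID = 0      # assumed id reserved for a [MASK]-style token
MASK_PROB = 0.3  # illustrative corruption rate

def corrupt(token_ids, mask_prob=MASK_PROB, mask_id=MASK_ID):
    """Replace a random subset of tokens with the mask id (the injected noise)."""
    noise = torch.rand_like(token_ids, dtype=torch.float) < mask_prob
    return token_ids.masked_fill(noise, mask_id)

# Denoising training step: encode the corrupted sentence, reconstruct the clean one.
vocab_size = 1000
model = SeqAutoencoder(vocab_size)                # class from the previous sketch
clean = torch.randint(1, vocab_size, (4, 12))     # clean target sentences (toy data)
noisy = corrupt(clean)                            # corrupted model input
logits = model(noisy)
loss = nn.functional.cross_entropy(logits.reshape(-1, vocab_size), clean.reshape(-1))
loss.backward()
```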