Research Area:  Machine Learning
Large-scale pretrained language models define the state of the art in natural language processing, achieving outstanding performance on a variety of tasks. We study how these architectures can be applied and adapted for natural language generation, comparing a number of architectural and training schemes. We focus in particular on open-domain dialog as a typical high-entropy generation task, presenting and comparing different architectures for adapting pretrained models, with state-of-the-art results.
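As a rough illustration of the kind of adaptation the abstract describes, the sketch below fine-tunes a pretrained causal language model on a dialog turn with a standard language-modeling loss and then samples a reply. This is a minimal sketch, not the authors' method: the model name ("gpt2"), the Hugging Face transformers API, and the toy dialog data are all assumptions introduced here for illustration.

```python
# Minimal sketch (not the paper's actual setup): adapting a pretrained
# causal LM to dialog generation via a language-modeling objective.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")   # illustrative model choice
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Toy dialog: context and reply concatenated into one sequence, so the
# pretrained LM learns to continue a conversation history with a response.
dialog = "How are you today? <|endoftext|> I'm doing great, thanks for asking!"
inputs = tokenizer(dialog, return_tensors="pt")

# One fine-tuning step: labels == input_ids gives the standard LM loss.
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
outputs = model(**inputs, labels=inputs["input_ids"])
outputs.loss.backward()
optimizer.step()

# Generation after adaptation: sample a reply for a new dialog context.
model.eval()
prompt = tokenizer("How are you today? <|endoftext|>", return_tensors="pt")
reply_ids = model.generate(
    **prompt,
    max_new_tokens=20,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(reply_ids[0], skip_special_tokens=True))
```

In practice, the paper compares several such adaptation architectures and training schemes rather than the single fine-tuning loop shown here.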
Keywords:  
Author(s) Name:  Sergey Golovanov, Rauf Kurbanov, Sergey Nikolenko, Kyryl Truskovskyi, Alexander Tselousov, Thomas Wolf
Journal name:  
Conference name:  57th Annual Meeting of the Association for Computational Linguistics (ACL 2019)
Publisher name:  Association for Computational Linguistics (ACL)
DOI:  10.18653/v1/P19-1608
Volume Information:  
Paper Link:   https://aclanthology.org/P19-1608/