BERT for Joint Intent Classification and Slot Filling - 2019


Research Area:  Machine Learning

Abstract:

Intent classification and slot filling are two essential tasks for natural language understanding. They often suffer from small-scale human-labeled training data, resulting in poor generalization capability, especially for rare words. Recently, a new language representation model, BERT (Bidirectional Encoder Representations from Transformers), has facilitated pre-training deep bidirectional representations on large-scale unlabeled corpora and has yielded state-of-the-art models for a wide variety of natural language processing tasks after simple fine-tuning. However, little effort has been made to explore BERT for natural language understanding. In this work, we propose a joint intent classification and slot filling model based on BERT. Experimental results demonstrate that our proposed model achieves significant improvements in intent classification accuracy, slot filling F1, and sentence-level semantic frame accuracy on several public benchmark datasets, compared to attention-based recurrent neural network models and slot-gated models.
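The joint architecture described in the abstract can be sketched as two task heads sharing one encoder: an intent classifier over the [CLS] (first-position) representation and a per-token slot classifier, trained with a summed cross-entropy loss. The sketch below is a minimal, hypothetical illustration, not the authors' implementation: a small `nn.TransformerEncoder` stands in for BERT, and all dimensions, label counts, and names (`JointIntentSlotModel`, `joint_loss`) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class JointIntentSlotModel(nn.Module):
    """Sketch of a joint intent/slot model; a tiny Transformer
    encoder stands in for a pre-trained BERT encoder."""

    def __init__(self, vocab_size=1000, d_model=64, num_intents=7, num_slots=12):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Intent head reads the first-position ([CLS]-style) vector.
        self.intent_head = nn.Linear(d_model, num_intents)
        # Slot head labels every token position.
        self.slot_head = nn.Linear(d_model, num_slots)

    def forward(self, input_ids):
        h = self.encoder(self.embed(input_ids))        # (batch, seq, d_model)
        intent_logits = self.intent_head(h[:, 0])      # (batch, num_intents)
        slot_logits = self.slot_head(h)                # (batch, seq, num_slots)
        return intent_logits, slot_logits

def joint_loss(intent_logits, slot_logits, intent_labels, slot_labels):
    """Joint objective: sum of intent and slot cross-entropies,
    so both tasks fine-tune the shared encoder end to end."""
    ce = nn.CrossEntropyLoss()
    # CrossEntropyLoss over tokens expects (batch, classes, seq).
    return ce(intent_logits, intent_labels) + ce(slot_logits.transpose(1, 2), slot_labels)
```

A forward pass on a batch of token-id sequences returns one intent distribution per utterance and one slot distribution per token, which is the structure the paper's joint fine-tuning exploits.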

Keywords:  

Author(s) Name:  Qian Chen, Zhu Zhuo, Wen Wang

Journal name:  Computer Science

Conference name:  

Publisher name:  arXiv:1902.10909

DOI:  10.48550/arXiv.1902.10909

Volume Information: