Research Area:  Machine Learning
We present simple BERT-based models for relation extraction and semantic role labeling. In recent years, state-of-the-art performance has been achieved using neural models by incorporating lexical and syntactic features such as part-of-speech tags and dependency trees. In this paper, extensive experiments on datasets for these two tasks show that without using any external features, a simple BERT-based model can achieve state-of-the-art performance. To our knowledge, we are the first to successfully apply BERT in this manner. Our models provide strong baselines for future research.
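The abstract describes feeding sentences to BERT without external lexical or syntactic features. For relation extraction, a common way to do this (the exact input encoding used by the paper is not stated in this abstract, so the marker tokens and function below are illustrative assumptions) is to wrap the subject and object entities in special marker tokens before the sequence is passed to the encoder:

```python
def mark_entities(tokens, subj_span, obj_span):
    """Insert entity marker tokens around subject and object spans.

    Spans are (start, end) token indices, end-exclusive. The marker
    strings [E1]/[E2] are hypothetical; in practice they would be added
    to the tokenizer's vocabulary as special tokens.
    """
    opens = {subj_span[0]: "[E1]", obj_span[0]: "[E2]"}
    closes = {subj_span[1]: "[/E1]", obj_span[1]: "[/E2]"}
    out = []
    for i, tok in enumerate(tokens):
        if i in closes:          # close a span that ends before token i
            out.append(closes[i])
        if i in opens:           # open a span that starts at token i
            out.append(opens[i])
        out.append(tok)
    if len(tokens) in closes:    # span ending at the final token
        out.append(closes[len(tokens)])
    return out

tokens = ["Barack", "Obama", "was", "born", "in", "Hawaii", "."]
marked = mark_entities(tokens, subj_span=(0, 2), obj_span=(5, 6))
# marked == ["[E1]", "Barack", "Obama", "[/E1]", "was", "born",
#            "in", "[E2]", "Hawaii", "[/E2]", "."]
```

The marked sequence can then be tokenized and encoded by BERT, with a classification head over the resulting representation predicting the relation label; no part-of-speech tags or dependency-tree features are needed, consistent with the paper's claim.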
Keywords:  
Author(s) Name:  Peng Shi, Jimmy Lin
Journal name:  Computer Science
Conference name:  
Publisher name:  arXiv
DOI:  10.48550/arXiv.1904.05255
Volume Information:  
Paper Link:  https://arxiv.org/abs/1904.05255