
K-BERT: Enabling Language Representation with Knowledge Graph - 2020

Research Area:  Machine Learning

Abstract:

Pre-trained language representation models, such as BERT, capture a general language representation from large-scale corpora but lack domain-specific knowledge. When reading a domain text, experts make inferences with relevant knowledge. For machines to achieve this capability, we propose a knowledge-enabled language representation model (K-BERT) with knowledge graphs (KGs), in which triples are injected into the sentences as domain knowledge. However, incorporating too much knowledge may divert a sentence from its correct meaning, an issue called knowledge noise (KN). To overcome KN, K-BERT introduces soft-position encoding and a visible matrix to limit the impact of the injected knowledge. Because K-BERT can load model parameters from pre-trained BERT, it can inject domain knowledge simply by being equipped with a KG, with no pre-training of its own. Our investigation reveals promising results on twelve NLP tasks. Especially on domain-specific tasks (including finance, law, and medicine), K-BERT significantly outperforms BERT, which demonstrates that K-BERT is an excellent choice for knowledge-driven problems that require expert knowledge.
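To make the mechanism described in the abstract concrete, below is a minimal Python sketch (not the authors' released code) of the two ideas: soft-position indices let injected triple tokens share positions with the entity they extend, and a visible matrix masks injected tokens from the rest of the sentence to limit knowledge noise. The toy KG, the word-level matching, and all function names are illustrative assumptions; the actual model builds a sentence tree over BERT's subword tokens.

```python
import numpy as np

# Hypothetical toy KG: entity token -> list of (relation, object) triples,
# loosely following the paper's example "Tim Cook is visiting Beijing now".
TOY_KG = {
    "Cook": [("CEO_of", "Apple")],
    "Beijing": [("capital_of", "China")],
}

def inject_triples(tokens, kg):
    """Flatten a K-BERT-style sentence tree.

    Injected triple tokens continue their entity's soft position, so the
    trunk (original sentence) keeps its word order; `root` records, for
    every output token, the output index of the trunk token it hangs from.
    """
    out, soft, root, trunk = [], [], [], []
    for i, tok in enumerate(tokens):
        r = len(out)                  # output index of this trunk token
        out.append(tok)
        soft.append(i)
        root.append(r)
        trunk.append(True)
        for rel, obj in kg.get(tok, []):
            out += [rel, obj]
            soft += [i + 1, i + 2]    # soft positions branch off the entity
            root += [r, r]            # branch rooted at the entity token
            trunk += [False, False]
    return out, soft, root, trunk

def visible_matrix(root, trunk):
    """M[i, j] = 1 iff tokens i and j may attend to each other.

    Trunk tokens all see each other; an injected token sees only its own
    branch and the entity it hangs from.
    """
    n = len(root)
    M = np.zeros((n, n), dtype=np.int8)
    for i in range(n):
        for j in range(n):
            if (trunk[i] and trunk[j]) or root[i] == root[j]:
                M[i, j] = 1
    return M

tokens = "Tim Cook is visiting Beijing now".split()
out, soft, root, trunk = inject_triples(tokens, TOY_KG)
M = visible_matrix(root, trunk)
# out  -> ['Tim', 'Cook', 'CEO_of', 'Apple', 'is', 'visiting',
#          'Beijing', 'capital_of', 'China', 'now']
# soft -> [0, 1, 2, 3, 2, 3, 4, 5, 6, 5]
```

In the model itself, the visible matrix is applied inside self-attention: invisible token pairs receive a large negative bias (e.g., scores += (1 - M) * -1e4) before the softmax, so injected triples influence only the entity they attach to.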

Keywords:  

Author(s) Name:  Weijie Liu, Peng Zhou, Zhe Zhao, Zhiruo Wang, Qi Ju, Haotang Deng, Ping Wang

Journal name:  

Conference name:  Proceedings of the AAAI Conference on Artificial Intelligence

Publisher name:  Association for the Advancement of Artificial Intelligence

DOI:  10.1609/aaai.v34i03.5681

Volume Information:  Volume 34, Issue 03 (2020)