Research Area:  Machine Learning
Knowledge graphs (KGs) can structurally organize large-scale information in the form of triples and significantly support many real-world applications. While most KG embedding algorithms assume that all triples are correct, considerable errors are inevitably injected during the construction process. It is urgent to develop effective error-aware KG embedding, since errors in KGs lead to significant performance degradation in downstream applications. To this end, we propose a novel framework named Attributed Error-aware Knowledge Embedding (AEKE). It leverages the semantics contained in entity attributes to guide the learning of the KG embedding model against the impact of erroneous triples. We design two triple-level hypergraphs to model the topological structures of the KG and its attributes, respectively. The confidence score of each triple is jointly calculated based on self-contradiction within the triple, consistency between local and global structures, and homogeneity between structures and attributes. We leverage confidence scores to adaptively update the weighted aggregation in the multi-view graph learning framework and the margin loss in KG embedding, such that potential errors contribute little to KG learning. Experiments on three real-world KGs demonstrate that AEKE outperforms state-of-the-art KG embedding and error detection algorithms.
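The abstract's core idea of down-weighting likely-erroneous triples can be illustrated with a minimal sketch. This is not AEKE's actual formulation (the paper's exact scoring and aggregation are not given here); it assumes a TransE-style distance score and a hypothetical per-triple confidence `conf` in [0, 1] standing in for the combined self-contradiction, structural-consistency, and attribute-homogeneity signals:

```python
import numpy as np

def score(h, r, t):
    # TransE-style plausibility: smaller ||h + r - t|| means more plausible.
    return np.linalg.norm(h + r - t)

def confidence_weighted_margin_loss(pos, neg, conf, margin=1.0):
    """pos/neg: lists of (h, r, t) embedding triples (positive and
    corrupted); conf: hypothetical per-triple confidence in [0, 1].
    Low-confidence (likely erroneous) triples contribute little."""
    total = 0.0
    for (ph, pr, pt), (nh, nr, nt), c in zip(pos, neg, conf):
        hinge = max(0.0, margin + score(ph, pr, pt) - score(nh, nr, nt))
        total += c * hinge  # a triple with c ~ 0 is effectively ignored
    return total / len(pos)

rng = np.random.default_rng(0)
d = 8
pos = [tuple(rng.normal(size=d) for _ in range(3)) for _ in range(4)]
neg = [tuple(rng.normal(size=d) for _ in range(3)) for _ in range(4)]
conf = [1.0, 0.9, 0.1, 0.0]  # last two triples treated as likely errors
loss = confidence_weighted_margin_loss(pos, neg, conf)
print(loss)
```

With all confidences set to zero the loss vanishes, which is the intended behavior: suspected errors stop driving the embedding updates.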
Keywords:  
Author(s) Name:  Qinggang Zhang, Junnan Dong, Qiaoyu Tan, Xiao Huang
Journal name:  IEEE Transactions on Knowledge and Data Engineering
Conference name:  
Publisher name:  IEEE
DOI:  10.1109/TKDE.2023.3310149
Volume Information:  Volume 36, Pages 1667-1682, 2023
Paper Link:  https://ieeexplore.ieee.org/document/10239484