Research Area:  Machine Learning
Knowledge graph embedding is a popular method for predicting missing links in knowledge graphs by projecting entities and relations into continuous, low-dimensional embeddings. Some recent embedding models employ translation-based operations to learn representations of entities and relations with shallow, linear structures, while others leverage neural networks, especially convolutional neural networks, to embed entities and relations with deep, non-linear structures. However, shallow and linear models limit the capacity to extract latent knowledge, while deep and non-linear models lead to an overabundance of parameters and the loss of surface, explicit knowledge. In this paper, we propose JointE, which jointly utilizes 1D and 2D convolution operations to alleviate these issues effectively. More specifically, we utilize 2D convolution operations to facilitate interactions between entities and relations, thereby capturing latent knowledge sufficiently. To significantly reduce the number of parameters, we construct the 2D convolution filters from the internal embeddings themselves rather than using external filters, which introduce many redundant parameters. Furthermore, we apply 1D convolution filters over the input embeddings to extract surface, explicit knowledge and preserve it via element-wise addition. Experimental evaluation on five benchmark datasets demonstrates that our model outperforms other state-of-the-art convolution-based models while improving parameter efficiency.
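To make the described pipeline concrete, below is a minimal PyTorch sketch of the joint 1D/2D convolution scoring idea from the abstract. All layer sizes, the reshaping scheme, and the names (JointConvSketch, dim, n2d, etc.) are hypothetical illustration choices, not the authors' published JointE architecture: the 2D filters are reshaped from the relation embedding itself ("internal" filters), a kernel-size-1 1D convolution reads the raw input embeddings, and the two branches merge by element-wise addition.

```python
# Illustrative sketch only; dimensions and wiring are assumptions, not the
# exact JointE configuration from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class JointConvSketch(nn.Module):
    def __init__(self, num_entities, num_relations,
                 dim=144, side=12, n2d=16, k2d=3, n1d=8):
        super().__init__()
        assert side * side == dim        # entity embedding reshapes to a 2D map
        assert n2d * k2d * k2d == dim    # relation embedding reshapes to 2D filters
        self.dim, self.side, self.n2d, self.k2d = dim, side, n2d, k2d
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        # 1D convolution over the stacked input embeddings; kernel size 1
        # keeps every embedding dimension aligned across entity and relation.
        self.conv1d = nn.Conv1d(2, n1d, kernel_size=1)
        self.fc1d = nn.Linear(n1d * dim, dim)
        self.fc2d = nn.Linear(n2d * dim, dim)

    def forward(self, h_idx, r_idx):
        h, r = self.ent(h_idx), self.rel(r_idx)            # (B, dim) each
        B = h.size(0)

        # 2D branch: filters come from the relation embedding ("internal"
        # filters), so no separate 2D filter parameters are stored. A grouped
        # convolution applies each example's own filters to its own entity map.
        img = h.view(1, B, self.side, self.side)           # one group per example
        filt = r.view(B * self.n2d, 1, self.k2d, self.k2d) # per-example filters
        x2d = F.conv2d(img, filt, padding=self.k2d // 2, groups=B)
        x2d = torch.relu(self.fc2d(x2d.view(B, -1)))       # (B, dim)

        # 1D branch: extracts surface features directly from the raw inputs.
        x1d = self.conv1d(torch.stack([h, r], dim=1))      # (B, n1d, dim)
        x1d = torch.relu(self.fc1d(x1d.view(B, -1)))       # (B, dim)

        # Element-wise addition merges the branches, preserving the surface
        # knowledge alongside the latent interactions.
        out = x1d + x2d                                    # (B, dim)
        return out @ self.ent.weight.t()                   # scores over all tails


# Usage: score every candidate tail entity for two (head, relation) queries.
model = JointConvSketch(num_entities=1000, num_relations=50)
scores = model(torch.tensor([0, 1]), torch.tensor([3, 7]))  # shape (2, 1000)
```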
Keywords:  
Author(s) Name:  Zhehui Zhou, Can Wang, Yan Feng, and Defang Chen
Journal name:  Knowledge-Based Systems
Conference name:  
Publisher name:  Elsevier
DOI:  10.1016/j.knosys.2021.108100
Volume Information:  Volume 240, March 2022
Paper Link:  https://dl.acm.org/doi/abs/10.1016/j.knosys.2021.108100