Graph Neural Transport Networks with Non-local Attentions for Recommender Systems - 2022

Research Area:  Machine Learning

Abstract:

Graph Neural Networks (GNNs) have emerged as powerful tools for collaborative filtering. A key challenge in recommendation is distilling long-range collaborative signals from user-item graphs. Typically, GNNs generate embeddings of users/items by propagating and aggregating messages between local neighbors. Thus, the ability of GNNs to capture long-range dependencies heavily depends on their depths. However, simply training deep GNNs suffers from several bottleneck effects, e.g., over-fitting and over-smoothing, which may lead to unexpected results if the GNNs are not well regularized. Here we present Graph Optimal Transport Networks (GOTNet) to capture long-range dependencies without increasing the depths of GNNs. Specifically, we perform k-Means clustering on the nodes' GNN embeddings to obtain graph-level representations (e.g., centroids). We then compute node-centroid attentions, which enable long-range messages to be communicated among distant but similar nodes. Our non-local attention operators work seamlessly with the local operators in the original GNNs. As such, GOTNet is able to capture both local and non-local messages in graphs using only shallow GNNs, which avoids the bottleneck effects of deep GNNs. Experimental results demonstrate that GOTNet achieves better performance compared with state-of-the-art GNNs.
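The abstract sketches a two-step mechanism: run k-Means on the nodes' GNN embeddings to obtain centroids, then let every node attend to those centroids so that distant but similar nodes can exchange messages through them. Below is a minimal PyTorch sketch of that idea; the function name nonlocal_attention, the hyper-parameters k and iters, the plain Lloyd's-style k-Means, and the scaled dot-product attention are illustrative assumptions, not the paper's exact formulation.

    import torch
    import torch.nn.functional as F

    def nonlocal_attention(node_emb, k=16, iters=10):
        # Illustrative sketch of GOTNet-style non-local attention;
        # hyper-parameters and clustering details are assumptions.
        n, d = node_emb.shape

        # Step 1: k-Means (Lloyd's algorithm) on the node embeddings
        # to obtain k graph-level centroids.
        centroids = node_emb[torch.randperm(n)[:k]].clone()
        for _ in range(iters):
            dists = torch.cdist(node_emb, centroids)  # (n, k) distances
            assign = dists.argmin(dim=1)              # nearest centroid id
            for c in range(k):
                members = assign == c
                if members.any():
                    centroids[c] = node_emb[members].mean(dim=0)

        # Step 2: node-centroid attention (scaled dot product), which
        # routes long-range messages between distant but similar nodes.
        scores = node_emb @ centroids.T / d ** 0.5    # (n, k)
        attn = F.softmax(scores, dim=1)
        return attn @ centroids                       # non-local messages

In use, the non-local messages would be combined with the output of a shallow local GNN, e.g. h = h + nonlocal_attention(h, k=32), so both local and long-range signals are captured without increasing depth.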

Keywords:  graph neural networks, collaborative filtering, embeddings, bottleneck effect, graph optimal transport networks

Author(s) Name:  Huiyuan Chen, Chin-Chia Michael Yeh, Fei Wang, Hao Yang

Journal name:  -

Conference name:  WWW '22: Proceedings of the ACM Web Conference 2022

Publisher name:  ACM

DOI:  https://doi.org/10.1145/3485447.3512162

Volume Information:  -