Research Area:  Machine Learning
Latent Factor Models (LFMs) based on Collaborative Filtering (CF) have been widely applied in recommendation systems due to their good prediction accuracy. In addition to user ratings, auxiliary information such as item features is often used to improve performance, especially when ratings are very sparse. To the best of our knowledge, most existing LFMs integrate different item features in the same way for all users. However, the attention paid to different item attributes varies considerably from user to user. For personalized recommendation, it is valuable to know which feature of an item a user cares about most. Moreover, the latent vectors used to represent users or items in LFMs have few explicit meanings, which makes it difficult to explain why an item is recommended to a specific user. In this work, we propose the Attention-driven Factor Model (AFM), which can not only integrate item features driven by users' attention but also help answer this "why". To estimate users' attention distributions over different item features, we propose Gated Attention Units (GAUs) for AFM. The GAUs make it possible to let the latent factors "talk", by generating user attention distributions from user latent vectors. With these attention distributions, we can tune the weights of item features for different users. Moreover, the attention distributions can also serve as explanations for our recommendations. Experiments on several real-world datasets demonstrate the advantages of AFM (using GAUs) over competitive baseline algorithms on rating prediction.
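To make the mechanism described in the abstract concrete, below is a minimal sketch of the idea: a gating unit maps a user's latent vector to an attention distribution over item features, which then weights those features in the rating score. The dimensions, the softmax gating form, and all function names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def gated_attention_unit(user_latent, W, b):
    """Map a user latent vector to an attention distribution over item features.
    (Assumed form: linear projection followed by softmax.)"""
    return softmax(W @ user_latent + b)                        # shape: (F,)

def predict_rating(user_latent, item_latent, item_feature_embeds, W, b):
    """Score an item with its feature embeddings weighted by the user's attention."""
    attn = gated_attention_unit(user_latent, W, b)             # (F,)
    weighted_features = attn @ item_feature_embeds             # (d,)
    return user_latent @ (item_latent + weighted_features)     # scalar score

# Toy usage with random parameters (d = latent dim, F = number of item features).
d, F = 8, 4
rng = np.random.default_rng(0)
user_latent = rng.normal(size=d)
item_latent = rng.normal(size=d)
item_feature_embeds = rng.normal(size=(F, d))                  # one embedding per feature
W, b = rng.normal(size=(F, d)), np.zeros(F)

attn = gated_attention_unit(user_latent, W, b)
print("user attention over features:", np.round(attn, 3))      # usable as an explanation
print("predicted score:", round(float(predict_rating(
    user_latent, item_latent, item_feature_embeds, W, b)), 3))
```

The printed attention vector is what would serve double duty in such a model: it reweights item features per user and, because it is a distribution over named features, it can be read directly as an explanation of why the item was recommended.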
Keywords:  
Explainable Personalized Recommendation
Latent Factor Models (LFMs)
Collaborative Filtering (CF)
Machine Learning
Deep Learning
Author(s) Name:  Jingwu Chen, Fuzhen Zhuang, Xin Hong, Xiang Ao, Xing Xie, Qing He
Journal name:  Computer Science
Conference name:  
Publisher name:  ACM
DOI:  10.1145/3209978.3210083
Volume Information: