
TNAM: A tag-aware neural attention model for Top-N recommendation - 2019

Research Area:  Machine Learning

Abstract:

Recent work shows that incorporating tag information into recommender systems is a promising way to improve recommendation accuracy in social systems. However, existing approaches assign tag weights poorly when constructing user profiles and item characteristics in real-world scenarios, which reduces recommendation accuracy. This issue can be summarized in two aspects: 1) the weight of a target item is mainly determined by the number of occurrences of one particular type of tag, and 2) users are assumed to place equal focus on the same tag across different items. To tackle these problems, we propose a novel model named TNAM, a Tag-aware Neural Attention Model, which accurately captures users' special attention to the tags of items. In the proposed model, we design a tag-based neural attention network that extracts latent tag information to overcome the difficulty of assigning personalized tag weights to users. We combine user-item interactions with tag information to map sparse data to dense vectors in a higher-order space. In this way, TNAM acquires more interrelations between users and items, making its recommendations more accurate. Extensive experiments on three public implicit-feedback datasets show significant improvements over several state-of-the-art approaches on the HR and NDCG metrics for Top-N recommendation tasks.
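
To make the abstract's core idea concrete (an attention network that weights an item's tags differently for each user, then fuses the attended tag profile with user and item embeddings to score items), the sketch below shows one possible formulation in PyTorch. It is not the authors' TNAM implementation: the embedding dimension, the additive attention form, and the dot-product scorer are illustrative assumptions based only on the abstract.

```python
# Minimal sketch of a tag-aware attention recommender for implicit feedback.
# NOT the TNAM architecture from the paper; layer sizes and the fusion rule
# are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TagAwareAttentionRecommender(nn.Module):
    def __init__(self, n_users, n_items, n_tags, dim=64):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)
        self.tag_emb = nn.Embedding(n_tags, dim, padding_idx=0)  # tag id 0 = padding
        self.att = nn.Linear(2 * dim, 1)  # scores each (user, tag) pair

    def forward(self, users, items, item_tags):
        """
        users:     (B,)   user ids
        items:     (B,)   item ids
        item_tags: (B, T) tag ids attached to each item, 0-padded
                   (each item is assumed to have at least one real tag)
        returns:   (B,)   preference scores
        """
        u = self.user_emb(users)        # (B, D)
        v = self.item_emb(items)        # (B, D)
        t = self.tag_emb(item_tags)     # (B, T, D)

        # Attention: how much this user cares about each tag of this item.
        u_rep = u.unsqueeze(1).expand(-1, t.size(1), -1)                # (B, T, D)
        logits = self.att(torch.cat([u_rep, t], dim=-1)).squeeze(-1)    # (B, T)
        logits = logits.masked_fill(item_tags == 0, float("-inf"))      # ignore padding
        alpha = F.softmax(logits, dim=-1)                               # (B, T)
        tag_profile = torch.bmm(alpha.unsqueeze(1), t).squeeze(1)       # (B, D)

        # Fuse the attended tag profile with the item embedding and score
        # against the user embedding via a dot product.
        return (u * (v + tag_profile)).sum(dim=-1)


# Tiny usage example with random ids (shapes only; no training loop shown).
model = TagAwareAttentionRecommender(n_users=100, n_items=500, n_tags=50)
users = torch.tensor([3, 7])
items = torch.tensor([10, 42])
item_tags = torch.tensor([[1, 4, 0], [2, 0, 0]])   # 0 = padding
scores = model(users, items, item_tags)
print(scores.shape)  # torch.Size([2])
```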

Keywords:  

Author(s) Name:  Ruoran Huang, Nian Wang, Chuanqi Han, Fang Yu, Li Cui

Journal name:  Neurocomputing

Conference name:  

Publisher name:  Elsevier

DOI:  https://doi.org/10.1016/j.neucom.2019.11.095

Volume Information:  Volume 385, 14 April 2020, Pages 1-12