

Attention-Based Context-Aware Sequential Recommendation Model

Research Area:  Machine Learning


Recurrent neural network (RNN) based recommendation algorithms have been introduced recently, as sequence information plays an increasingly important role in modeling user preferences. However, these methods have notable limitations: they usually give undue importance to sequential changes while placing insufficient emphasis on the correlation between adjacent items, and they typically ignore the impact of context information. To address these issues, we propose an attention-based context-aware sequential recommendation model using the Gated Recurrent Unit (GRU), abbreviated as ACA-GRU. First, we consider the impact of context information on recommendations and classify it into four categories: input context, correlation context, static interest context, and transition context. Then, by redefining the update and reset gates of the GRU unit, we compute the global sequential state transition of the RNN as determined by these contexts, thereby modeling the dynamics of user interest. Finally, by leveraging an attention mechanism over the correlation context, the model distinguishes the importance of each item in the rating sequence, so that the influence of outliers that are less informative or less predictive is reduced or ignored. Experimental results indicate that ACA-GRU outperforms state-of-the-art context-aware models as well as sequential recommendation algorithms, demonstrating the effectiveness of the proposed model.
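The two core ideas in the abstract, context-modulated GRU gates and attention over the rating sequence, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; all parameter names, sizes, and the single shared context vector per step are illustrative assumptions standing in for the four context categories described above.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def context_gru_step(x, h, ctx, params):
    """One GRU step whose update and reset gates also see a context
    vector, loosely following the ACA-GRU idea of letting context steer
    the state transition. Parameter names are illustrative, not the
    paper's notation."""
    Wz, Uz, Cz, Wr, Ur, Cr, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h + Cz @ ctx)   # update gate with context term
    r = sigmoid(Wr @ x + Ur @ h + Cr @ ctx)   # reset gate with context term
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate hidden state
    return (1 - z) * h + z * h_tilde

def attention_pool(hs, query):
    """Softmax attention over the sequence of hidden states: items whose
    states align poorly with the query receive small weights, which is
    one way to down-weight less informative outliers."""
    scores = hs @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights, weights @ hs

rng = np.random.default_rng(0)
d = 8                                          # toy embedding/hidden size
params = [rng.normal(scale=0.1, size=(d, d)) for _ in range(8)]

h = np.zeros(d)
hs = []
for _ in range(5):                             # a toy rating sequence of 5 items
    x = rng.normal(size=d)                     # item embedding (random stand-in)
    ctx = rng.normal(size=d)                   # context vector (random stand-in)
    h = context_gru_step(x, h, ctx, params)
    hs.append(h)
hs = np.stack(hs)

weights, user_repr = attention_pool(hs, query=h)
print(weights.round(3), user_repr.shape)
```

In the actual model the contexts would be learned embeddings of the four categories rather than random vectors, and all weights would be trained end to end on a rating-prediction loss; the sketch only shows where context and attention enter the computation.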


Author(s) Name:  Weihua Yuan, Hong Wang, Xiaomei Yu, Nan Liu, Zhenghao Li

Journal name:  Information Sciences

Conference name:  

Publisher name:  Elsevier

DOI:  10.1016/j.ins.2019.09.007

Volume Information:  Volume 510, February 2020, Pages 122-134