
An Enhanced Data Sparsity Reduction Method for Effective Collaborative Filtering Recommendations - 2020

Research Area:  Machine Learning

Abstract:

Collaborative filtering recommender systems suffer from the data sparsity problem because of their reliance on numerical ratings to provide recommendations to users. This problem makes it difficult for the system to compute accurate similar neighbours for items and to provide good-quality recommendations. Existing methods fail to pre-process the missing ratings of new items and to predict cold items for active users, which leads to poor-quality recommendations. In this work, a sparsity reduction method is presented to improve the quality of recommendations. The method utilises a Bi-Separated Clustering algorithm to cluster the rating matrix simultaneously into user and item bi-clusters based on rating classification. It also employs a Bi-Mean Imputation algorithm to fill the missing ratings in the bi-clusters with the estimated means. The method then performs the traditional collaborative filtering process on the new rating matrix to predict cold items. The experimental results demonstrate that, compared to the existing method, the proposed BiSCBiMI improves the density of the rating matrix by 5.75%, 10.73% and 7.35% on the considered datasets, and also improves the Mean Absolute Error (MAE) of new-item prediction on all of them. The results indicate that the proposed approaches are effective in reducing data sparsity and in item prediction, which in turn yields good-quality recommendations.
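The core idea of the abstract — partition the rating matrix into user/item bi-clusters, then fill each bi-cluster's missing entries with that bi-cluster's estimated mean — can be sketched as follows. This is a hypothetical illustration, not the paper's BiSCBiMI implementation: here the user and item groups come from a crude mean-rating median split rather than the authors' Bi-Separated Clustering algorithm, and `0` is used to denote a missing rating.

```python
import numpy as np

def bimean_impute(R):
    """Fill missing ratings (0 = missing) with the mean of the
    (user-group, item-group) bi-cluster each cell falls into.
    Illustrative sketch: groups come from a simple median split on
    observed mean ratings, not the paper's BiSC algorithm."""
    R = R.astype(float)
    mask = R > 0                                   # observed entries
    # Crude "bi-clustering": split users/items by their mean observed rating.
    user_means = R.sum(1) / np.maximum(mask.sum(1), 1)
    item_means = R.sum(0) / np.maximum(mask.sum(0), 1)
    user_grp = (user_means >= np.median(user_means)).astype(int)
    item_grp = (item_means >= np.median(item_means)).astype(int)

    filled = R.copy()
    global_mean = R[mask].mean()                   # fallback estimate
    for ug in range(2):
        for ig in range(2):
            cell = np.ix_(user_grp == ug, item_grp == ig)
            block, bmask = R[cell], mask[cell]
            # Bi-Mean Imputation step: estimated mean of this bi-cluster.
            est = block[bmask].mean() if bmask.any() else global_mean
            filled[cell] = np.where(bmask, block, est)
    return filled
```

After imputation, the densified matrix can be handed to a standard neighbourhood-based collaborative filtering step to predict cold items, as the abstract describes.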

Author(s) Name:  Abubakar Roko, Abba Almu, Aminu Mohammed, Ibrahim Saidu

Journal name:  International Journal of Education and Management Engineering

Conference name:  

Publisher name:  MECS PRESS

DOI:  10.5815/ijeme.2020.01.04

Volume Information:  Vol. 10, Iss. 1