Towards Class-Imbalance Aware Multi-Label Learning - 2020

Research Area:  Machine Learning


Multi-label learning deals with training examples each represented by a single instance while associated with multiple class labels. Because the predictive model must consider an exponential number of possible label sets, it is commonly assumed that label correlations should be exploited to design an effective multi-label learning approach. On the other hand, class imbalance is an intrinsic property of multi-label data that significantly affects the generalization performance of the multi-label predictive model: for each class label, the number of training examples with a positive labeling assignment is generally far smaller than the number with a negative one. To deal with the class-imbalance issue in multi-label learning, this article proposes a simple yet effective class-imbalance aware learning strategy called cross-coupling aggregation (Cocoa). Specifically, Cocoa simultaneously exploits label correlations and explores class imbalance. For each class label, a number of multiclass imbalance learners are induced by randomly coupling it with other labels, and their predictions on an unseen instance are aggregated to determine the corresponding labeling relevancy. Extensive experiments on 18 benchmark datasets validate the effectiveness of Cocoa against state-of-the-art multi-label learning approaches, especially in terms of imbalance-specific evaluation metrics.
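The cross-coupling idea described above can be sketched in code. The sketch below is an illustration only, not the authors' exact Cocoa procedure: it assumes a simple four-class encoding of each coupled label pair (the paper's own decomposition may differ), uses a hypothetical `cocoa_fit_predict` helper, and borrows scikit-learn's `DecisionTreeClassifier` as a stand-in base learner.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def cocoa_fit_predict(X_train, Y_train, X_test, n_couples=3, rng_seed=0):
    """Illustrative cross-coupling aggregation sketch.

    For each label q, train `n_couples` multiclass learners, each coupling
    q with one randomly chosen other label r (the class encodes the joint
    assignment (y_q, y_r)), then aggregate each learner's estimated
    relevance of q on the test instances.
    """
    rng = np.random.default_rng(rng_seed)
    n_labels = Y_train.shape[1]
    scores = np.zeros((X_test.shape[0], n_labels))
    for q in range(n_labels):
        others = [r for r in range(n_labels) if r != q]
        coupled = rng.choice(others, size=min(n_couples, len(others)),
                             replace=False)
        for r in coupled:
            # Encode the joint assignment (y_q, y_r) as one of four classes:
            # 0 -> (0,0), 1 -> (0,1), 2 -> (1,0), 3 -> (1,1).
            y_joint = 2 * Y_train[:, q] + Y_train[:, r]
            clf = DecisionTreeClassifier(max_depth=3, random_state=0)
            clf.fit(X_train, y_joint)
            proba = clf.predict_proba(X_test)
            # Relevance of label q = total probability mass of the classes
            # in which y_q = 1 (classes 2 and 3).
            for cls_idx, cls in enumerate(clf.classes_):
                if cls >= 2:
                    scores[:, q] += proba[:, cls_idx]
        scores[:, q] /= len(coupled)
    return scores  # per-label relevance estimates in [0, 1]
```

Thresholding the returned scores (e.g. at 0.5) yields the predicted label set for each test instance; averaging over several randomly coupled learners is what smooths out the imbalance seen by any single pairing.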


Author(s) Name:  Min-Ling Zhang; Yu-Kun Li; Hao Yang; Xu-Ying Liu

Journal name:  IEEE Transactions on Cybernetics (Early Access)

Publisher name:  IEEE

DOI:  10.1109/TCYB.2020.3027509

Volume Information:  Page(s): 1-13