Research Area:  Machine Learning
Cost-sensitive learning arises in many real-world applications and is an important learning paradigm in machine learning. The recently proposed cost-sensitive hinge loss support vector machine (CSHL-SVM) guarantees consistency with the cost-sensitive Bayes risk and achieves better generalization accuracy than traditional cost-sensitive support vector machines. In practice, data often arrive in sequential chunks, a setting known as the on-line scenario. Conventional batch learning algorithms waste considerable time in this scenario because they must re-train the model from scratch whenever new data appear. To make CSHL-SVM more practical for the on-line scenario, we propose a chunk incremental learning algorithm for CSHL-SVM that updates a trained model without re-training from scratch when a chunk of new samples is incorporated. Our method is efficient because it can update the trained model with multiple samples at a time, not just one sample at a time. Experimental results on a variety of datasets confirm the effectiveness of CSHL-SVM and show that our method is more efficient than both the batch algorithm of CSHL-SVM and single-sample incremental learning for CSHL-SVM.
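To illustrate the on-line scenario the abstract describes, the following sketch trains a cost-sensitive hinge-loss linear classifier chunk by chunk, updating the model as each chunk arrives instead of re-training from scratch. This is only a minimal illustration using scikit-learn's `SGDClassifier` with hinge loss and per-class costs via `class_weight`; it is NOT the paper's CSHL-SVM solver or its chunk incremental algorithm, and the dataset is synthetic.

```python
# Hedged sketch: chunk-wise incremental updates for a cost-sensitive
# hinge-loss linear classifier. This uses scikit-learn's SGDClassifier,
# not the CSHL-SVM algorithm from the paper; it only illustrates the
# on-line (sequential-chunk) setting and cost-sensitive weighting.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

# Synthetic binary data; suppose errors on class 1 are costlier.
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)

# Cost-sensitive weighting: a mistake on class 1 costs 5x more.
clf = SGDClassifier(loss="hinge",
                    class_weight={0: 1.0, 1: 5.0},
                    random_state=0)

# Data arrive in sequential chunks; the model is updated per chunk
# via partial_fit, without re-training on all data seen so far.
chunk_size = 200
for start in range(0, len(X), chunk_size):
    Xc = X[start:start + chunk_size]
    yc = y[start:start + chunk_size]
    clf.partial_fit(Xc, yc, classes=np.array([0, 1]))

print(clf.score(X, y))  # training accuracy after the final chunk
```

In contrast, a batch learner would refit on the full accumulated dataset after every chunk, which is exactly the re-training cost the paper's chunk incremental algorithm avoids.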
Keywords:  
Chunk Incremental Learning
Cost-Sensitive Hinge Loss Support Vector Machine
Machine Learning
Deep Learning
Author(s) Name:  Bin Gu, Xin Quan, Yunhua Gu, Victor S. Sheng, Guansheng Zheng
Journal name:  Pattern Recognition
Conference name:  
Publisher name:  Elsevier
DOI:  10.1016/j.patcog.2018.05.023
Volume Information:  Volume 83, November 2018, Pages 196-208
Paper Link:   https://www.sciencedirect.com/science/article/abs/pii/S0031320318301973