Research Area:  Machine Learning
Incremental and on-line learning have recently gained attention, especially in the context of big data and learning from data streams, where the traditional assumption of complete data availability no longer holds. Even though a variety of methods are available, it often remains unclear which of them is suitable for a specific task and how they perform in comparison to each other. We analyze the key properties of eight popular incremental methods representing different algorithm classes. We evaluate them with regard to their on-line classification error as well as their behavior in the limit. Further, we discuss the often neglected issue of hyperparameter optimization specifically for each method and test how robustly it can be done based on a small set of examples. Our extensive evaluation on data sets with different characteristics gives an overview of the performance with respect to accuracy, convergence speed, and model complexity, facilitating the choice of the best method for a given application.
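The on-line classification error mentioned in the abstract is commonly measured with an interleaved test-then-train (prequential) protocol: each incoming example is first used to test the current model, then to update it. The following is a minimal sketch of that protocol, assuming a simple online perceptron on synthetic 2-D data; the paper's actual methods and data sets differ.

```python
import random

random.seed(0)

def make_stream(n):
    """Generate a linearly separable stream: label = sign(x0 + x1)."""
    for _ in range(n):
        x = [random.uniform(-1, 1), random.uniform(-1, 1)]
        y = 1 if x[0] + x[1] > 0 else -1
        yield x, y

w = [0.0, 0.0]   # perceptron weights (stand-in for any incremental learner)
b = 0.0          # bias
errors = 0
n = 1000

for x, y in make_stream(n):
    # Test first: predict with the current model ...
    pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1
    if pred != y:
        errors += 1
        # ... then train: perceptron update on mistakes
        w[0] += y * x[0]
        w[1] += y * x[1]
        b += y

online_error = errors / n
print(f"prequential error over {n} examples: {online_error:.3f}")
```

Because every example is tested before it is trained on, the resulting error curve also reflects convergence speed, one of the criteria compared in the paper.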
Keywords:  
Incremental On-Line Learning
Machine Learning
Deep Learning
Author(s) Name:  Viktor Losing, Barbara Hammer, Heiko Wersing
Journal name:  Neurocomputing
Conference name:  
Publisher name:  Elsevier
DOI:  10.1016/j.neucom.2017.06.084
Volume Information:  Volume 275, 31 January 2018, Pages 1261-1274
Paper Link:   https://www.sciencedirect.com/science/article/abs/pii/S0925231217315928