Incremental Learning of Object Detectors without Catastrophic Forgetting - 2017

Research Area:  Machine Learning

Abstract:

Despite their success for object detection, convolutional neural networks are ill-equipped for incremental learning, i.e., adapting the original model trained on a set of classes to additionally detect objects of new classes, in the absence of the initial training data. They suffer from "catastrophic forgetting" - an abrupt degradation of performance on the original set of classes, when the training objective is adapted to the new classes. We present a method to address this issue, and learn object detectors incrementally, when neither the original training data nor annotations for the original classes in the new training set are available. The core of our proposed solution is a loss function to balance the interplay between predictions on the new classes and a new distillation loss which minimizes the discrepancy between responses for old classes from the original and the updated networks. This incremental learning can be performed multiple times, for a new set of classes in each step, with a moderate drop in performance compared to the baseline network trained on the ensemble of data. We present object detection results on the PASCAL VOC 2007 and COCO datasets, along with a detailed empirical analysis of the approach.
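
To make the distillation idea concrete, here is a minimal PyTorch-style sketch of such a combined objective. It is an illustration under our own assumptions, not the authors' exact formulation: the function name incremental_detection_loss and the distill_weight parameter are hypothetical, and the L2-on-logits choice below is just one simple way to minimize the discrepancy between the old-class responses of the frozen original network and those of the updated network.

import torch.nn.functional as F

def incremental_detection_loss(new_cls_logits, old_cls_logits,
                               new_bbox_deltas, old_bbox_deltas,
                               detection_loss, num_old_classes,
                               distill_weight=1.0):
    # Hypothetical combined objective: the standard detection loss on the
    # new classes, plus a distillation term that keeps the updated network's
    # old-class responses close to those of the frozen original network.
    #   new_cls_logits:  [N, C_old + C_new] class logits, updated network
    #   old_cls_logits:  [N, C_old] class logits, frozen original network
    #   *_bbox_deltas:   matching [N, C, 4] box-regression outputs
    cls_distill = F.mse_loss(new_cls_logits[:, :num_old_classes],
                             old_cls_logits)
    box_distill = F.mse_loss(new_bbox_deltas[:, :num_old_classes],
                             old_bbox_deltas)
    return detection_loss + distill_weight * (cls_distill + box_distill)

In the paper, the distillation targets are computed on region proposals with the original network kept frozen; the squared-error term above is one plausible instance of "minimizing the discrepancy between responses for old classes" described in the abstract.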

Keywords:  Incremental Learning, Object Detectors, Catastrophic Forgetting, Machine Learning, Deep Learning

Author(s) Name:  Konstantin Shmelkov, Cordelia Schmid, Karteek Alahari

Journal name:  

Conference name:  IEEE International Conference on Computer Vision (ICCV), 2017

Publisher name:  arXiv (preprint arXiv:1708.06977)

DOI:  10.48550/arXiv.1708.06977

Volume Information: