
Toward Training Recurrent Neural Networks for Lifelong Learning - 2020

Research Area:  Machine Learning

Abstract:

Catastrophic forgetting and capacity saturation are the central challenges of any parametric lifelong learning system. In this work, we study these challenges in the context of sequential supervised learning with an emphasis on recurrent neural networks. To evaluate the models in the lifelong learning setting, we propose a curriculum-based, simple, and intuitive benchmark where the models are trained on tasks with increasing levels of difficulty. To measure the impact of catastrophic forgetting, the model is tested on all the previous tasks as it completes any task. As a step toward developing true lifelong learning systems, we unify gradient episodic memory (a catastrophic forgetting alleviation approach) and Net2Net (a capacity expansion approach). Both models were originally proposed in the context of feedforward networks, and we evaluate the feasibility of using them for recurrent networks. Evaluation on the proposed benchmark shows that the unified model is more suitable than the constituent models for the lifelong learning setting.
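To make the capacity-expansion idea concrete, here is a minimal sketch of the Net2Net "wider" operator the abstract refers to, applied to a one-hidden-layer feedforward network. This is an illustration only, not the paper's code: the function name `net2wider` and all variable names are assumptions, and the paper's contribution is adapting this idea (together with gradient episodic memory) to recurrent weights. The operator grows a hidden layer while preserving the function the network computes, by duplicating random units and splitting their outgoing weights among the copies.

```python
import numpy as np

def net2wider(W1, b1, W2, new_width, rng):
    """Net2Net 'wider' sketch: expand a hidden layer from h to new_width
    units, function-preservingly. W1: (in, h), b1: (h,), W2: (h, out)."""
    h = W1.shape[1]
    # Keep the original h units, then duplicate randomly chosen units.
    mapping = np.concatenate([np.arange(h),
                              rng.integers(0, h, new_width - h)])
    counts = np.bincount(mapping, minlength=h)  # copies of each original unit
    W1_new = W1[:, mapping]   # duplicated units copy their incoming weights
    b1_new = b1[mapping]
    # Split each duplicated unit's outgoing weights among its copies so
    # the layer's total contribution to the next layer is unchanged.
    W2_new = W2[mapping, :] / counts[mapping][:, None]
    return W1_new, b1_new, W2_new

# Sanity check: the widened network computes the same function.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))
b1 = rng.standard_normal(3)
W2 = rng.standard_normal((3, 2))
x = rng.standard_normal(4)
y_before = np.maximum(W1.T @ x + b1, 0) @ W2       # ReLU hidden layer
W1n, b1n, W2n = net2wider(W1, b1, W2, 5, rng)
y_after = np.maximum(W1n.T @ x + b1n, 0) @ W2n
```

Function preservation is what makes the operator useful for lifelong learning: capacity can be added when a model saturates without disturbing what it has already learned.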

Keywords:  

Author(s) Name:  Shagun Sodhani, Sarath Chandar, Yoshua Bengio

Journal name:  Neural Computation

Conference name:  

Publisher name:  MIT Press

DOI:  10.1162/neco_a_01246

Volume Information:  (2020), Volume 32, Issue 1, Pages 1–35.