arxiv:1612.00796

Overcoming catastrophic forgetting in neural networks

Published on Dec 2, 2016
Authors:
James Kirkpatrick, Razvan Pascanu, Neil Rabinowitz, Joel Veness, Guillaume Desjardins, Andrei A. Rusu, Kieran Milan, John Quan, Tiago Ramalho, Agnieszka Grabska-Barwinska, Demis Hassabis, Claudia Clopath, Dharshan Kumaran, Raia Hadsell

Abstract

The ability to learn tasks in a sequential fashion is crucial to the development of artificial intelligence. Neural networks are not, in general, capable of this, and it has been widely thought that catastrophic forgetting is an inevitable feature of connectionist models. We show that it is possible to overcome this limitation and train networks that can maintain expertise on tasks that they have not experienced for a long time. Our approach remembers old tasks by selectively slowing down learning on the weights important for those tasks. We demonstrate our approach is scalable and effective by solving a set of classification tasks based on the MNIST handwritten digit dataset and by learning several Atari 2600 games sequentially.
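
For concreteness, the "selective slowing down" described above can be written as a quadratic penalty anchored at the solution found for an earlier task. The sketch below restates the EWC objective in the paper's notation (task-A/task-B losses, a diagonal Fisher importance term F_i, and a strength λ); it is a summary, not a verbatim quotation of the paper's equation.

```latex
% Sketch of the EWC objective when training on task B after task A.
% L_B: loss for the new task; theta*_{A,i}: parameters learned for task A;
% F_i: diagonal Fisher information estimated on task A (importance of weight i);
% lambda: how strongly old-task knowledge is protected.
\mathcal{L}(\theta) = \mathcal{L}_B(\theta)
  + \sum_i \frac{\lambda}{2}\, F_i \left(\theta_i - \theta^{*}_{A,i}\right)^2
```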

Community

We have made an open-source implementation of EWC (Elastic Weight Consolidation) available in the adaptive-classifier library here - https://github.com/codelion/adaptive-classifier/blob/main/src/adaptive_classifier/ewc.py
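
For readers who want the mechanics without opening the linked file, here is a minimal PyTorch sketch of that penalty. It is not the adaptive-classifier API: the class name `EWCPenalty`, the helper `fisher_diag`, and the squared-gradient Fisher estimate are illustrative assumptions.

```python
# Minimal EWC sketch (illustrative; not the adaptive-classifier API).
import torch
import torch.nn.functional as F


def fisher_diag(model, data_loader):
    """Diagonal Fisher estimate: average squared gradients of the
    negative log-likelihood over old-task data (assumes a classifier
    that returns logits)."""
    fisher = {n: torch.zeros_like(p)
              for n, p in model.named_parameters() if p.requires_grad}
    total = 0
    model.eval()
    for x, y in data_loader:
        model.zero_grad()
        loss = F.nll_loss(F.log_softmax(model(x), dim=-1), y)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2 * x.size(0)
        total += x.size(0)
    return {n: f / max(total, 1) for n, f in fisher.items()}


class EWCPenalty:
    """Quadratic penalty (lambda/2) * sum_i F_i * (theta_i - theta*_A_i)^2
    that slows learning on weights marked as important for the old task."""

    def __init__(self, model, old_task_loader, lam=1000.0):
        self.lam = lam
        # Anchor at the parameters learned for the old task.
        self.anchor = {n: p.detach().clone()
                       for n, p in model.named_parameters() if p.requires_grad}
        self.fisher = fisher_diag(model, old_task_loader)

    def __call__(self, model):
        penalty = 0.0
        for n, p in model.named_parameters():
            if n in self.fisher:
                penalty = penalty + (self.fisher[n] * (p - self.anchor[n]) ** 2).sum()
        return 0.5 * self.lam * penalty
```

Usage follows the two-task setting from the abstract: after finishing task A, construct `ewc = EWCPenalty(model, loader_A)` and then minimize `task_b_loss + ewc(model)` while training on task B. The strength `lam` trades off new-task accuracy against retention of task A.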

Models citing this paper 7

Datasets citing this paper 0

No datasets citing this paper.

Spaces citing this paper 9

Collections including this paper 0

No collections including this paper.