Overcoming catastrophic forgetting with hard attention to the task

01/04/2018
by   Joan Serrà, et al.

Catastrophic forgetting occurs when a neural network loses the information learned in a first task after training on a second task. This problem remains a hurdle for general artificial intelligence systems with sequential learning capabilities. In this paper, we propose a task-based hard attention mechanism that preserves previous tasks' information without substantially affecting the current task's learning. An attention mask is learned concurrently to every task through stochastic gradient descent, and previous masks are exploited to constrain such learning. We show that the proposed mechanism is effective for reducing catastrophic forgetting, cutting current rates by 45 to 80%. We also show that it is robust to different hyperparameter choices and that it offers a number of monitoring capabilities. The approach features the possibility to control both the stability and compactness of the learned knowledge, which we believe makes it also attractive for online learning and network compression applications.

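To make the mechanism concrete, below is a minimal PyTorch sketch of the idea, not the authors' reference implementation: a per-task embedding passed through a scaled sigmoid produces an (almost) hard mask over a hidden layer, and the element-wise maximum of previous tasks' masks is used to shrink the gradients of units those tasks rely on. The layer sizes, the annealing schedule for the scaling factor `s`, and the use of a single gated layer are illustrative assumptions; the full method described in the paper also conditions gradients on the masks of adjacent layers and compensates the embedding gradients, which are omitted here for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HATNet(nn.Module):
    """Two-layer MLP whose hidden units are gated by a per-task attention mask."""
    def __init__(self, n_tasks, in_dim=784, hidden=256, out_dim=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, out_dim)
        # One trainable embedding per task; a scaled sigmoid of it acts as the
        # (approximately hard) attention mask over the hidden layer.
        self.emb1 = nn.Embedding(n_tasks, hidden)

    def mask(self, task, s):
        # Larger s pushes the sigmoid towards a binary (hard) mask.
        return torch.sigmoid(s * self.emb1(task))

    def forward(self, x, task, s):
        m1 = self.mask(task, s)                 # (1, hidden)
        h = F.relu(self.fc1(x)) * m1            # gate hidden units with the task mask
        return self.fc2(h), m1


def constrain_gradients(net, cum_mask):
    """Scale fc1 gradients so units claimed by previous tasks barely change.
    (Simplified: the paper conditions on the masks of both adjacent layers.)"""
    free = (1.0 - cum_mask).view(-1, 1)         # ~0 for units used by earlier tasks
    if net.fc1.weight.grad is not None:
        net.fc1.weight.grad *= free
        net.fc1.bias.grad *= free.view(-1)


def train_task(net, task_id, loader, cum_mask, epochs=1, s_max=400.0, lr=0.05):
    opt = torch.optim.SGD(net.parameters(), lr=lr)
    t = torch.tensor([task_id])
    for _ in range(epochs):
        for i, (x, y) in enumerate(loader):
            # Anneal the mask from soft to (almost) hard over the epoch.
            s = 1.0 / s_max + (s_max - 1.0 / s_max) * i / max(len(loader) - 1, 1)
            logits, _ = net(x.view(x.size(0), -1), t, s)
            loss = F.cross_entropy(logits, y)
            opt.zero_grad()
            loss.backward()
            constrain_gradients(net, cum_mask)  # protect previous tasks' units
            opt.step()
    # Remember which units this task uses (element-wise max over task masks).
    with torch.no_grad():
        m = net.mask(t, s_max).squeeze(0)
    return torch.max(cum_mask, m)


# Illustrative usage with random data standing in for two sequential tasks.
net = HATNet(n_tasks=2)
cum_mask = torch.zeros(256)
for task_id in range(2):
    x = torch.randn(512, 784)
    y = torch.randint(0, 10, (512,))
    loader = torch.utils.data.DataLoader(
        torch.utils.data.TensorDataset(x, y), batch_size=64)
    cum_mask = train_task(net, task_id, loader, cum_mask)
```

After each task, the cumulative mask records which hidden units that task relies on; scaling the gradients of those units towards zero is what keeps later tasks from overwriting them, while unclaimed units remain free for new learning.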