Catastrophic Forgetting

Catastrophic forgetting, also known as catastrophic interference, occurs in machine learning when a model loses previously learned information as it learns new information. It can also be described as a failure of stability, in which new experience overwrites previous experience.

Catastrophic forgetting can occur when:

  • A neural network or machine learning model "forgets" or dramatically reduces its performance on previously learned tasks after learning a new task
  • An artificial neural network abruptly and drastically forgets previously learned information upon learning new information
  • A network is applied to data that is too far removed from the distribution it was originally trained on
  • A model already trained on one task is then optimized only on task B, taking gradient steps that reduce task B's loss with nothing preserving performance on the earlier task (see the sketch after this list)
  • The ability to discriminate between data from different tasks worsens
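
Below is a minimal sketch of that sequential-training failure mode in PyTorch. The two synthetic tasks, the small network, and the hyperparameters are illustrative assumptions rather than a standard benchmark: the model is trained on task A, then trained only on task B, and its task A accuracy is checked before and after. In a typical run, task A accuracy drops sharply after training on B, even though task A's data never changed.

```python
# Minimal sketch of catastrophic forgetting on two synthetic tasks
# (assumed toy setup, not a specific benchmark from the literature).
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(n=2000, dim=20):
    """Create a synthetic binary classification task: two Gaussian clusters."""
    centers = torch.randn(2, dim) * 3.0
    labels = torch.randint(0, 2, (n,))
    inputs = centers[labels] + torch.randn(n, dim)
    return inputs, labels

def train(model, inputs, labels, epochs=200):
    """Full-batch training that minimizes the loss on the given task only."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(inputs), labels).backward()
        opt.step()

def accuracy(model, inputs, labels):
    with torch.no_grad():
        return (model(inputs).argmax(dim=1) == labels).float().mean().item()

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))

xa, ya = make_task()  # task A
xb, yb = make_task()  # task B: same format, different cluster centers

train(model, xa, ya)
print(f"Task A accuracy after training on A: {accuracy(model, xa, ya):.2f}")

# Sequential training on task B alone: every gradient step follows task B's
# loss, with nothing constraining the weights that encoded task A.
train(model, xb, yb)
print(f"Task A accuracy after training on B: {accuracy(model, xa, ya):.2f}")
print(f"Task B accuracy after training on B: {accuracy(model, xb, yb):.2f}")
```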

Most current approaches that deal with forgetting ignore the complementary problem of catastrophic remembering, the opposite failure in which a model becomes so stable that it can no longer absorb new experience.