Machine Learning

Catastrophic Forgetting

The tendency of neural networks to abruptly and severely lose previously learned information when trained sequentially on new data or tasks. Gradient updates for the new task overwrite the weights that encode the old knowledge.

Why It Matters

Catastrophic forgetting is why you cannot simply fine-tune a model on new data without risking degraded performance on earlier tasks. It is a central challenge in continual learning.

Example

A model fine-tuned on medical text that loses its ability to write code: the new training overwrote the weights that encoded its programming knowledge.
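The effect can be sketched numerically. The toy setup below (a single-weight linear model, two made-up regression tasks, plain gradient descent) is an illustrative assumption, not from the original text: training on task B drives the weight away from the value that solved task A, so task A's error jumps.

```python
import numpy as np

def train(w, xs, ys, lr=0.1, steps=200):
    # Plain gradient descent on mean-squared error for the model y = w * x.
    for _ in range(steps):
        grad = np.mean(2 * (w * xs - ys) * xs)
        w -= lr * grad
    return w

def mse(w, xs, ys):
    return float(np.mean((w * xs - ys) ** 2))

xs = np.array([1.0, 2.0, 3.0])
task_a = 2.0 * xs    # task A: learn y = 2x
task_b = -2.0 * xs   # task B: learn y = -2x

w = train(0.0, xs, task_a)
loss_a_before = mse(w, xs, task_a)   # near zero: task A is learned

w = train(w, xs, task_b)             # sequential training on task B
loss_a_after = mse(w, xs, task_a)    # task A error is now large

print(loss_a_before, loss_a_after)
```

The same weight cannot fit both targets at once, so optimizing for task B necessarily undoes task A. Mitigations such as replaying old data or regularizing weight changes (e.g. elastic weight consolidation) work by constraining exactly this overwriting.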

Think of it like...

Like a student who crams for a biology exam and completely forgets everything they studied for last week's history exam — new learning erases old learning.

Related Terms