Elastic Weight Consolidation
A technique for continual learning that estimates how important each weight is to previously learned tasks (typically via the Fisher information matrix) and penalizes changes to the important weights when training on new tasks.
Why It Matters
EWC is one of the best-known approaches to mitigating catastrophic forgetting, helping models learn new tasks without losing performance on old ones.
Example
Training a model on task B while adding a quadratic penalty that discourages the weights most important for task A from drifting far from their task-A values.
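The penalty above can be sketched in a few lines. This is a minimal illustration, not the original implementation: all names are hypothetical, and it assumes the diagonal Fisher information (one importance score per weight) has already been estimated on task A.

```python
def ewc_penalty(weights, old_weights, fisher, lam=1.0):
    """Quadratic penalty anchoring important weights near their task-A values.

    weights     : current parameters while training on task B
    old_weights : parameters learned on task A
    fisher      : diagonal Fisher information (importance score per weight)
    lam         : strength of the consolidation penalty
    """
    return (lam / 2.0) * sum(
        f * (w - w_old) ** 2
        for w, w_old, f in zip(weights, old_weights, fisher)
    )

def total_loss(task_b_loss, weights, old_weights, fisher, lam=1.0):
    # Loss optimized while training on task B: the new task's loss
    # plus the consolidation penalty protecting task A's knowledge.
    return task_b_loss + ewc_penalty(weights, old_weights, fisher, lam)
```

Note that moving a high-importance weight costs far more than moving a low-importance one: with `fisher = [4.0, 0.1]`, shifting the first weight by 1.0 is penalized forty times as heavily as shifting the second by the same amount.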
Think of it like...
Like protecting the load-bearing walls when renovating a house — you can change everything else, but the critical structural elements must remain intact.
Related Terms
Catastrophic Forgetting
The tendency of neural networks to completely forget previously learned information when trained on new data or tasks. New learning overwrites old knowledge.
Continual Learning
Training a model on new data or tasks over time without forgetting previously learned knowledge. Also called lifelong learning or incremental learning.
Regularization
Techniques used to prevent overfitting by adding constraints or penalties to the model during training. Regularization discourages the model from becoming too complex or fitting noise in the training data.
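For contrast with EWC's weighted penalty, a plain L2 penalty (a common regularization technique) treats every weight identically, shrinking all of them toward zero. A minimal sketch with hypothetical names:

```python
def l2_penalty(weights, lam=0.01):
    # Plain L2 regularization: every weight is penalized equally,
    # regardless of how important it is to any previous task.
    return lam * sum(w ** 2 for w in weights)
```

EWC can be viewed as replacing this uniform penalty with one that is weighted per-parameter by importance and centered on the old weight values rather than on zero.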
Weight
A numerical parameter in a neural network that is learned during training. Weights determine the strength of connections between neurons and collectively encode the model's knowledge.