Continual Learning
Training a model on new data or tasks over time without forgetting previously learned knowledge. Also called lifelong learning or incremental learning.
Why It Matters
Continual learning is essential for production AI that must adapt to changing conditions without periodic full retraining, which is expensive and slow.
Example
A fraud detection model that can learn to recognize new fraud patterns as they emerge without losing its ability to detect older fraud techniques.
Think of it like...
Like a doctor who keeps learning about new diseases and treatments throughout their career without forgetting what they learned in medical school.
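One common continual-learning strategy is rehearsal (replay): keep a small buffer of old examples and mix them into every update on the new task. The sketch below is a toy illustration, not a production recipe — the tasks, data, and learning rate are all invented for demonstration. A single-weight regressor learns task A (y = 2x), then task B (y = 5x), either naively or with task-A examples replayed alongside task B.

```python
import numpy as np

def sgd(w, xs, ys, lr=0.01, epochs=200):
    """Plain per-example SGD on squared error for a single-weight model."""
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            w += lr * (y - w * x) * x  # gradient step on (y - w*x)^2
    return w

xs_a = np.array([1.0, 2.0, 3.0]); ys_a = 2.0 * xs_a  # task A: y = 2x
xs_b = np.array([1.0, 2.0, 3.0]); ys_b = 5.0 * xs_b  # task B: y = 5x

# Naive sequential training: learn A, then train on B alone.
w_naive = sgd(sgd(0.0, xs_a, ys_a), xs_b, ys_b)

# Rehearsal: store task-A examples in a replay buffer and
# mix them into the task-B training stream.
mix_x = np.concatenate([xs_b, xs_a])
mix_y = np.concatenate([ys_b, ys_a])
w_replay = sgd(sgd(0.0, xs_a, ys_a), mix_x, mix_y)

err_a = lambda w: float(np.mean((w * xs_a - ys_a) ** 2))
print(f"task-A error, naive : {err_a(w_naive):.2f}")   # large: A was forgotten
print(f"task-A error, replay: {err_a(w_replay):.2f}")  # smaller: replay retains A
```

With replay, the weight settles near a compromise between the two tasks, so the error on task A stays much lower than under naive sequential training. Real systems use the same idea with richer models and larger buffers (or generated pseudo-examples).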
Related Terms
Catastrophic Forgetting
The tendency of neural networks to abruptly lose previously learned information when trained on new data or tasks: gradient updates for the new task overwrite the weights that encoded the old one.
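The effect is easy to measure even in a toy model. In this hypothetical setup, a single-weight regressor masters task A (y = 2x), then trains on task B (y = 5x) alone; its task-A error before and after shows the forgetting.

```python
import numpy as np

def sgd(w, xs, ys, lr=0.01, epochs=200):
    """Per-example SGD on squared error for a single-weight model."""
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            w += lr * (y - w * x) * x
    return w

xs = np.array([1.0, 2.0, 3.0])
w = sgd(0.0, xs, 2.0 * xs)  # learn task A: w converges near 2
err_a_before = float(np.mean((w * xs - 2.0 * xs) ** 2))

w = sgd(w, xs, 5.0 * xs)    # now learn task B: w is dragged toward 5
err_a_after = float(np.mean((w * xs - 2.0 * xs) ** 2))

print(f"task-A error before B: {err_a_before:.4f}")  # near zero
print(f"task-A error after  B: {err_a_after:.2f}")   # jumps: A is forgotten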
Online Learning
A training paradigm where the model updates continuously as new data arrives, one example at a time (or in small batches), rather than training on a fixed dataset.
Transfer Learning
A technique where a model trained on one task is repurposed as the starting point for a model on a different but related task. Instead of training from scratch, you leverage knowledge the model has already acquired.
Model Monitoring
The practice of continuously tracking an ML model's performance, predictions, and input data in production to detect degradation, drift, or anomalies after deployment.