Federated Learning
A decentralized training approach where a model is trained across multiple devices or organizations without sharing raw data. Each participant trains locally and shares only model updates, which a coordinator aggregates into a shared global model.
Why It Matters
Federated learning enables AI training on sensitive data (e.g., medical or financial records) without compromising privacy. It unlocks collaboration between organizations that cannot legally or practically share data with one another.
Example
Multiple hospitals training a cancer detection model together — each hospital trains on its own patient data locally and only shares the model weight updates, never the patient records.
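The hospital example can be sketched with federated averaging, the canonical aggregation rule: each participant runs local gradient descent, and the server averages the resulting weights. This is a minimal illustration using a toy linear model; the helper names (`local_update`, `federated_average`) are illustrative, not a real library API.

```python
def local_update(weights, data, lr=0.1):
    # One local pass of gradient descent on a squared-error linear model,
    # standing in for a hospital's private training step.
    new_w = weights[:]
    for x, y in data:
        pred = sum(w * xi for w, xi in zip(new_w, x))
        err = pred - y
        for i, xi in enumerate(x):
            new_w[i] -= lr * err * xi
    return new_w

def federated_average(global_w, client_datasets):
    # Each client trains on its own data; only weights are shared.
    client_weights = [local_update(global_w, d) for d in client_datasets]
    # The server averages the updates -- raw records never leave a client.
    return [sum(ws) / len(ws) for ws in zip(*client_weights)]
```

Repeating the round (broadcast global weights, train locally, average) drives the shared model toward a fit of the combined data without any dataset being pooled.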
Think of it like...
Like a group of chefs developing a recipe together by sharing cooking tips but never revealing their secret ingredients — the recipe improves without exposing proprietary information.
Related Terms
Differential Privacy
A mathematical framework that provides provable privacy guarantees when analyzing or learning from data. It ensures that the output of any analysis is approximately the same whether or not any individual's data is included.
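A standard instance of this framework is the Laplace mechanism: noise calibrated to the query's sensitivity and the privacy budget epsilon is added to the true answer. A minimal sketch, assuming a numeric query with known sensitivity; the function name is illustrative.

```python
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    # Release true_value + Laplace(scale = sensitivity / epsilon) noise,
    # which satisfies epsilon-differential privacy for this query.
    scale = sensitivity / epsilon
    # The difference of two independent Exp(1) draws is Laplace(1)-distributed.
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_value + noise
```

For a counting query ("how many patients have condition X?"), sensitivity is 1, since adding or removing one individual changes the count by at most 1; smaller epsilon means more noise and stronger privacy.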
Distributed Training
Splitting model training across multiple GPUs or machines to handle larger models or datasets and reduce training time. Techniques include data parallelism and model parallelism.
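Data parallelism, the most common of these techniques, can be sketched as: shard a batch across workers, compute gradients per shard, then average them (the "all-reduce" step) before updating. A toy single-process version with a 1-D linear model; in real systems the shards run on separate GPUs, and equal shard sizes are assumed here so the average of shard gradients matches the full-batch gradient.

```python
def gradient(w, batch):
    # Mean squared-error gradient for the model y = w * x,
    # standing in for one worker's backward pass.
    g = 0.0
    for x, y in batch:
        g += 2 * (w * x - y) * x
    return g / len(batch)

def data_parallel_step(w, batch, num_workers=4, lr=0.05):
    # Shard the batch across workers, compute gradients "in parallel"
    # (sequentially here), then all-reduce by averaging.
    shards = [batch[i::num_workers] for i in range(num_workers)]
    grads = [gradient(w, s) for s in shards if s]
    avg_grad = sum(grads) / len(grads)
    return w - lr * avg_grad
```

Model parallelism instead splits the model itself (layers or tensor slices) across devices when it is too large to fit on one.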
Data Privacy
The right of individuals to control how their personal information is collected, used, stored, and shared. In AI, data privacy concerns arise from training data, user interactions, and model outputs.