Few-Shot Learning
A technique where a model learns to perform a task from only a few examples provided in the prompt. Instead of training on thousands of examples, the model generalizes from just 2-5 demonstrations.
Why It Matters
Few-shot learning makes AI accessible without massive datasets or fine-tuning. It enables rapid prototyping and adaptation to new tasks with minimal effort.
Example
Showing an LLM three examples of converting casual text to formal text, then asking it to convert a fourth — the model learns the pattern from just those examples.
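The casual-to-formal example above can be sketched in code. This is a minimal illustration of how a few-shot prompt is typically assembled: the demonstration pairs and the helper function are hypothetical, and in practice the resulting string would be sent to an LLM API of your choice.

```python
def build_few_shot_prompt(examples, query):
    """Assemble demonstration pairs plus a new input into one prompt."""
    blocks = []
    for casual, formal in examples:
        # Each demonstration shows the input and its desired output.
        blocks.append(f"Casual: {casual}\nFormal: {formal}")
    # End with the new input and an empty slot for the model to fill.
    blocks.append(f"Casual: {query}\nFormal:")
    return "\n\n".join(blocks)

# Three hypothetical demonstrations of the casual-to-formal pattern.
examples = [
    ("hey, can u send me that file?", "Could you please send me that file?"),
    ("gonna be late, sorry", "I apologize; I will be arriving late."),
    ("thx for the help!", "Thank you very much for your assistance."),
]

prompt = build_few_shot_prompt(examples, "what's up with the meeting?")
print(prompt)
```

The model sees the three completed pairs followed by the fourth input, and continues the pattern by producing a formal version — no training or weight updates involved.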
Think of it like...
Like a quick learner who watches a chef make a dish three times and can then reproduce it — they grasp the pattern quickly from minimal demonstration.
Related Terms
Zero-Shot Learning
A model's ability to perform a task it was never explicitly trained on or shown examples of. The model applies its general knowledge and reasoning to handle entirely new task types.
In-Context Learning
An LLM's ability to learn new tasks from examples or instructions provided within the prompt, without any weight updates or fine-tuning. The model adapts its behavior based on the context given.
Prompt Engineering
The practice of designing and optimizing input prompts to get the best possible output from AI models. It involves crafting instructions, providing examples, and structuring queries to guide the model toward desired responses.
Meta-Learning
An approach where models 'learn to learn': they are trained across many tasks so they can quickly adapt to a new task with minimal data. Few-shot learning in LLMs is often viewed as an emergent form of meta-learning.