Machine Learning

Batch Size

The number of training examples processed together before the model updates its parameters. Batch size affects training speed, memory usage, and how smoothly the model learns.

Why It Matters

Batch size is a critical trade-off between training speed, memory constraints, and model quality. Larger batches use hardware more efficiently and give smoother gradient estimates, but the reduced gradient noise can hurt generalization; smaller batches train more slowly per epoch but their noisier updates sometimes generalize better.

Example

Processing 32 images at a time through a neural network, computing the average error across all 32, then updating the weights once — that is a batch size of 32.
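The example above can be sketched as a minimal mini-batch training loop. This is an illustrative NumPy implementation (fitting a toy linear model rather than an image network, and all names and values here are hypothetical): each iteration averages the error gradient over 32 examples, then applies a single weight update.

```python
import numpy as np

# Toy data: fit y = 2x + 1 (illustrative stand-in for a real dataset).
rng = np.random.default_rng(0)
X = rng.normal(size=(320, 1))
y = 2.0 * X[:, 0] + 1.0 + rng.normal(scale=0.1, size=320)

w, b = 0.0, 0.0
lr = 0.1
batch_size = 32  # examples processed before each parameter update

for epoch in range(50):
    order = rng.permutation(len(X))  # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        xb, yb = X[idx, 0], y[idx]
        err = (w * xb + b) - yb
        # Gradients of mean squared error, averaged across the batch
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)
        w -= lr * grad_w  # one update per batch of 32
        b -= lr * grad_b

print(w, b)  # converges near the true values 2.0 and 1.0
```

With 320 examples and a batch size of 32, each epoch performs 10 updates; shrinking the batch size to 1 would perform 320 noisier updates per epoch, while a full batch of 320 would perform exactly one.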

Think of it like...

Like grading papers — you could adjust your rubric after each paper (batch size 1), after every few papers (mini-batch), or only after reading the whole stack (full batch). Each cadence trades responsiveness for stability in the feedback you act on.

Related Terms