Machine Learning

Bias-Variance Tradeoff

The fundamental tension in machine learning between a model that is too simple (high bias, underfitting) and one that is too complex (high variance, overfitting). The goal is to find the sweet spot where total error on unseen data is lowest.
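For squared-error loss, the tradeoff can be made precise: the expected prediction error at a point decomposes into bias, variance, and irreducible noise (the standard decomposition, stated here for a true function f, a learned model f̂, and noise variance σ², with expectations taken over training sets):

```latex
\mathbb{E}\left[(y - \hat{f}(x))^2\right]
  = \underbrace{\left(\mathbb{E}[\hat{f}(x)] - f(x)\right)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\left[\left(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\right)^2\right]}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{irreducible noise}}
```

Simple models shrink the variance term but inflate the bias term; complex models do the reverse. No model can reduce the σ² term.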

Why It Matters

Understanding the bias-variance tradeoff is essential for model selection and debugging. It explains why adding complexity does not always improve generalization: a more flexible model can fit the training data better yet predict worse on unseen data.

Example

A linear model (high bias) that consistently predicts house prices $50K too low, versus a complex model (high variance) whose predictions are sometimes $100K off in either direction.

Think of it like...

Like a golf swing — too rigid (high bias) and you consistently miss in one direction, too loose (high variance) and your shots scatter everywhere. The sweet spot is controlled flexibility.

Related Terms