Short Definition
The bias–variance tradeoff balances underfitting and overfitting.
Definition
The bias–variance tradeoff describes how a model's generalization error arises from two competing sources. Bias is systematic error from assumptions that are too simple; variance is sensitivity to noise in the particular training set. High-bias models are too simple and miss real patterns, while high-variance models are too flexible and fit noise.
Effective model design seeks a balance that minimizes total generalization error.
Why It Matters
Understanding this tradeoff guides practical decisions about model capacity, training duration, regularization strength, and how much data to collect.
How It Works (Conceptually)
- High bias → underfitting
- High variance → overfitting
- Optimal balance → best generalization
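The two failure modes above can be illustrated with a small toy experiment: a constant predictor (high bias) versus a 1-nearest-neighbor predictor (high variance) on synthetic quadratic data. The dataset, noise level, and sample sizes are arbitrary choices for illustration.

```python
import random

random.seed(1)

def make_data(n):
    # Noisy samples of y = x^2 on [0, 1]; the noise level 0.1 is arbitrary.
    return [(x, x * x + random.gauss(0, 0.1))
            for x in (random.random() for _ in range(n))]

train, test = make_data(20), make_data(200)

def mse(model, data):
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

# High-bias model: predicts the mean training target everywhere (underfits).
mean_y = sum(y for _, y in train) / len(train)

def constant_model(x):
    return mean_y

# High-variance model: 1-nearest-neighbor, memorizes the training set (overfits).
def one_nn(x):
    return min(train, key=lambda p: abs(p[0] - x))[1]

print("constant train/test MSE:", mse(constant_model, train), mse(constant_model, test))
print("1-NN     train/test MSE:", mse(one_nn, train), mse(one_nn, test))
```

The constant model has similar, high error on both sets (underfitting), while 1-NN scores a perfect zero on the training set but a clearly worse test error (the overfitting gap).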
Minimal Python Example
# Expected squared error decomposes as bias^2 + variance + irreducible noise:
total_error = bias**2 + variance + irreducible_error
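A runnable sketch of this decomposition, using a toy mean-estimation problem. The shrinkage factor 0.5, the sample sizes, and the trial count are arbitrary choices; the point is that measured MSE equals bias squared plus variance for each estimator.

```python
import random

random.seed(0)

TRUE_MEAN = 2.0
N_SAMPLES = 10    # observations per simulated dataset
N_TRIALS = 2000   # number of simulated datasets

def simulate(estimator):
    """Estimate (bias^2, variance, mse) of an estimator over many datasets."""
    estimates = []
    for _ in range(N_TRIALS):
        data = [random.gauss(TRUE_MEAN, 1.0) for _ in range(N_SAMPLES)]
        estimates.append(estimator(data))
    mean_est = sum(estimates) / N_TRIALS
    bias_sq = (mean_est - TRUE_MEAN) ** 2
    variance = sum((e - mean_est) ** 2 for e in estimates) / N_TRIALS
    mse = sum((e - TRUE_MEAN) ** 2 for e in estimates) / N_TRIALS
    return bias_sq, variance, mse

def sample_mean(xs):
    # Unbiased estimator: zero bias, higher variance.
    return sum(xs) / len(xs)

def shrunk_mean(xs):
    # Shrinks toward 0: adds bias, reduces variance.
    return 0.5 * sum(xs) / len(xs)

for name, est in [("sample mean", sample_mean), ("shrunk mean", shrunk_mean)]:
    b2, var, mse = simulate(est)
    print(f"{name}: bias^2={b2:.4f}  variance={var:.4f}  mse={mse:.4f}")
```

Shrinking trades variance for bias; whether that lowers total error depends on the problem, which is exactly the tradeoff.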
Common Pitfalls
- Increasing complexity without more data
- Misdiagnosing underfitting as overfitting (or vice versa) from training error alone
- Ignoring validation curves
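The last pitfall can be made concrete with a toy validation curve, sketched here with a k-nearest-neighbor regressor on synthetic data (the dataset, noise level, and k values are arbitrary choices; smaller k means higher model complexity):

```python
import random

random.seed(2)

def make_data(n):
    # Noisy quadratic on [0, 1]; the noise level 0.1 is arbitrary.
    return [(x, (2 * x - 1) ** 2 + random.gauss(0, 0.1))
            for x in (random.random() for _ in range(n))]

train, valid = make_data(40), make_data(200)

def knn_predict(k, x):
    # Average the targets of the k training points nearest to x.
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return sum(y for _, y in nearest) / k

def mse(k, data):
    return sum((knn_predict(k, x) - y) ** 2 for x, y in data) / len(data)

for k in (1, 3, 7, 15, 40):
    print(f"k={k:2d}  train MSE={mse(k, train):.4f}  valid MSE={mse(k, valid):.4f}")
```

Training error alone is misleading: k=1 memorizes the data and scores zero, while k=40 predicts a global average everywhere. The validation column is what reveals where the complexity sweet spot lies.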
Related Concepts
- Overfitting
- Regularization
- Model Capacity