Short Definition
Overfitting occurs when a model memorizes its training data, noise included, instead of learning patterns that generalize to new data.
Definition
An overfitted model performs well on training data but poorly on unseen data. This happens when the model captures noise instead of general patterns.
Overfitting is a symptom of excessive model capacity, insufficient data, or overly long training.
Why It Matters
Models that overfit report strong training metrics yet perform poorly once deployed, because real-world inputs are, by definition, data the model has never seen.
How It Works (Conceptually)
- Training loss keeps decreasing as the model fits the training set ever more closely
- Validation loss falls at first, then turns upward once the model starts fitting noise
- The widening gap between the two curves signals degrading generalization
Minimal Python Example
# Fit polynomials of increasing degree to noisy samples and compare
# training error against held-out error (illustrative data, using NumPy).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, size=30)
x_train, y_train = x[::2], y[::2]   # 15 points used for fitting
x_val, y_val = x[1::2], y[1::2]     # 15 held-out points

for degree in (1, 3, 14):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    val_mse = np.mean((np.polyval(coeffs, x_val) - y_val) ** 2)
    print(f"degree {degree:2d}: train_mse={train_mse:.4f}  val_mse={val_mse:.4f}")

# training_loss ↓ while validation_loss ↑ as capacity grows past the sweet spot
Common Pitfalls
- Skipping a held-out validation set, so overfitting goes undetected
- Using models with far more capacity than the data supports
- Training for too many epochs without a stopping criterion
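The last pitfall is commonly addressed with early stopping: track validation loss each epoch and stop once it has not improved for a set number of epochs. A minimal sketch, where the loss values and the `patience` parameter are illustrative assumptions:

```python
def early_stopping_epoch(val_losses, patience=3):
    """Return the epoch to stop at: the one with the lowest validation
    loss, found once `patience` epochs pass without improvement."""
    best_loss = float("inf")
    best_epoch = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_loss = loss
            best_epoch = epoch
        elif epoch - best_epoch >= patience:
            break  # no improvement for `patience` epochs: stop training
    return best_epoch

# Validation loss falls, then rises as the model starts to overfit
val_losses = [1.0, 0.7, 0.5, 0.45, 0.47, 0.52, 0.60, 0.75]
print(early_stopping_epoch(val_losses))  # → 3 (epoch with the lowest loss)
```

In a real training loop the same logic runs online, checkpointing the model at each new best epoch and restoring that checkpoint when the patience budget runs out.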
Related Concepts
- Regularization
- Bias–Variance Tradeoff
- Early Stopping
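Of these, regularization is the easiest to sketch: an L2 penalty shrinks coefficients toward zero, trading a little bias for less variance. A minimal ridge-regression sketch in closed form, where the data, dimensions, and penalty strength are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 15, 15                     # as many parameters as samples: ripe for overfitting
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.0, 0.5]     # only three features actually matter
y = X @ w_true + rng.normal(0, 0.5, size=n)

def ridge(X, y, lam):
    """Closed-form L2-regularized least squares: w = (X^T X + lam*I)^-1 X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w_plain = ridge(X, y, lam=1e-8)   # effectively unregularized: fits the noise exactly
w_l2 = ridge(X, y, lam=5.0)       # penalty shrinks coefficients toward zero

# The L2 solution always has the smaller norm; the unregularized fit
# typically inflates coefficients to chase noise.
print(np.linalg.norm(w_plain))
print(np.linalg.norm(w_l2))
```

The same penalty appears in gradient-based training as weight decay; the strength `lam` controls the tradeoff between fitting the training data and keeping the model simple.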