Early Stopping

Short Definition

Early stopping halts training as soon as performance on a held-out validation set stops improving.

Definition

Early stopping is a training control technique that stops optimization once the model’s performance on validation data stops improving. Rather than training for a fixed number of epochs, the process is terminated based on observed generalization behavior.

This prevents the model from continuing to fit noise in the training data.

Why It Matters

Early stopping is one of the simplest and most effective defenses against overfitting. It also saves compute by skipping epochs that no longer improve generalization, and it acts as a form of implicit regularization on training.

How It Works (Conceptually)

  • Monitor a validation loss or metric after each epoch
  • Track the best-performing model seen so far (checkpoint its weights)
  • Stop once no improvement has been seen for a set number of epochs (the patience)
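The steps above can be sketched as a small, framework-agnostic helper. The EarlyStopper class, its parameter names, and the simulated validation losses below are illustrative assumptions, not part of any particular library:

```python
class EarlyStopper:
    """Tracks the best validation loss and signals when to stop."""

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience    # epochs to wait after the last improvement
        self.min_delta = min_delta  # minimum decrease that counts as improvement
        self.best_loss = float("inf")
        self.counter = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True if training should stop."""
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss  # new best: reset the patience counter
            self.counter = 0
        else:
            self.counter += 1          # no improvement this epoch
        return self.counter >= self.patience


# Simulated per-epoch validation losses: improvement stalls after epoch 3.
val_losses = [0.9, 0.7, 0.6, 0.55, 0.56, 0.57, 0.58, 0.59]
stopper = EarlyStopper(patience=3)
for epoch, loss in enumerate(val_losses):
    if stopper.step(loss):
        print(f"Stopping at epoch {epoch}, best loss {stopper.best_loss}")
        break
```

With patience=3, the loop stops three epochs after the best loss (0.55) is recorded, rather than running through all eight simulated epochs.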

Minimal Python Example

if val_loss < best_loss:
    best_loss, patience_counter = val_loss, 0  # improvement: reset patience
else:
    patience_counter += 1                      # no improvement this epoch

Common Pitfalls

  • Monitoring training loss instead of validation loss (training loss keeps falling even while the model overfits)
  • Using very small patience values, so a noisy validation metric triggers a premature stop
  • Stopping before the model has converged on the underlying signal
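Stopping alone also leaves you with the final, degraded weights; tracking and restoring the best checkpoint avoids that. A minimal sketch, with a plain dictionary standing in for real model parameters and made-up per-epoch values:

```python
import copy

best_weights = None
best_loss = float("inf")

# Each epoch yields (val_loss, weights after that epoch) — illustrative values.
history = [(0.8, {"w": 0.9}), (0.5, {"w": 0.7}), (0.6, {"w": 0.6})]
for val_loss, model_weights in history:
    if val_loss < best_loss:
        best_loss = val_loss
        # Deep-copy so later training updates don't mutate the snapshot.
        best_weights = copy.deepcopy(model_weights)

# Restore the best checkpoint instead of keeping the final (worse) weights.
model_weights = best_weights
```

Here the last epoch's loss (0.6) is worse than the best (0.5), so restoring recovers the epoch-2 weights; real frameworks expose this as an option on their early-stopping callbacks.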

Related Concepts

  • Overfitting
  • Validation Set
  • Training Loop