Short Definition
The training loop is the core routine that drives learning: it repeatedly feeds data through the model, measures the error, and updates the model's parameters.
Definition
The training loop orchestrates forward passes, loss computation, backpropagation, and parameter updates. It defines how data flows through the model during training and how learning progresses over time.
A correct training loop is essential for reliable results.
Why It Matters
Errors in the training loop can break learning silently. A missed gradient reset or evaluation code leaking into training typically raises no exception; the only symptom is a model that fails to improve, or improves for the wrong reasons.
How It Works (Conceptually)
- Forward pass
- Loss computation
- Backward pass
- Parameter update
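These four steps can be sketched as a framework-free toy loop that fits a single weight so that w * x matches y = 2 * x. All names and values here are illustrative, not from any library:

```python
# Toy training loop: learn w so that w * x approximates y = 2 * x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0        # single learnable parameter
lr = 0.05      # learning rate

for epoch in range(100):
    for x, y in data:
        pred = w * x                # forward pass
        loss = (pred - y) ** 2      # loss computation (squared error)
        grad = 2 * (pred - y) * x   # backward pass: d(loss)/dw
        w -= lr * grad              # parameter update

print(round(w, 3))  # prints 2.0
```

Each line of the inner loop corresponds to one of the four conceptual steps; real frameworks automate the backward pass and the update, but the control flow is the same.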
Minimal Python Example
```python
# All four functions are placeholders for framework-specific calls.
for epoch in range(epochs):
    preds = forward()           # forward pass
    loss = compute_loss(preds)  # loss computation
    backward(loss)              # backward pass: compute gradients
    update()                    # parameter update
```
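A slightly fuller, framework-free version of the same loop: a two-parameter linear model fit to y = 2x + 1, with the mean loss logged once per epoch. The data, learning rate, and variable names are illustrative:

```python
# Two-parameter training loop with per-epoch loss logging.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
w, b = 0.0, 0.0
lr = 0.02
history = []

for epoch in range(200):
    epoch_loss = 0.0
    for x, y in data:
        pred = w * x + b            # forward pass
        loss = (pred - y) ** 2      # loss computation
        dw = 2 * (pred - y) * x     # backward pass: d(loss)/dw
        db = 2 * (pred - y)         # backward pass: d(loss)/db
        w -= lr * dw                # parameter update
        b -= lr * db
        epoch_loss += loss
    history.append(epoch_loss / len(data))  # mean loss, not last-sample loss

print(round(w, 2), round(b, 2))  # w, b approach 2.0 and 1.0
```

Logging the averaged loss per epoch, rather than whatever the last batch happened to produce, gives a less noisy picture of training progress.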
Common Pitfalls
- Forgetting to reset gradients, so stale gradients from earlier steps leak into each update
- Mixing training and evaluation logic, e.g. updating parameters on validation data, or evaluating with training-only behavior such as dropout still active
- Logging misleading metrics, such as reporting only the last batch's loss instead of an epoch average
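The first pitfall is easy to demonstrate without any framework. Autograd engines such as PyTorch's accumulate gradients by default, so if the gradient buffer is never zeroed, each update applies stale gradient mass on top of the fresh one. In the sketch below (all names illustrative), the same descent on loss = (w - 3)^2 either converges or oscillates indefinitely depending on the reset:

```python
# Gradient descent on loss = (w - 3) ** 2, with and without
# zeroing the accumulated gradient before each step.
def train(reset_grad):
    w, grad = 0.0, 0.0
    for step in range(50):
        if reset_grad:
            grad = 0.0           # correct: zero the gradient every step
        grad += 2 * (w - 3)      # accumulate d(loss)/dw
        w -= 0.1 * grad          # parameter update
    return w

print(round(train(reset_grad=True), 3))   # prints 3.0 (converges)
print(round(train(reset_grad=False), 3))  # ends far from 3: updates oscillate
```

Both runs execute without any error, which is exactly why this bug is dangerous: only the final parameter values reveal that learning was broken.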
Related Concepts
- Epochs
- Backpropagation
- Optimization