Short Definition
A loss function measures how wrong a model’s predictions are.
Definition
A loss function quantifies the difference between a model’s prediction and the true target value. It converts model performance into a single numerical value that training algorithms can optimize.
Different tasks require different loss functions, depending on whether the goal is regression, classification, or ranking.
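To make the task dependence concrete, here is a minimal sketch of binary cross-entropy, a standard loss for binary classification (the function name and clamping epsilon are illustrative choices, not from the original text):

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Average negative log-likelihood for binary labels (0 or 1)
    against predicted probabilities in (0, 1)."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1 - eps)  # clamp to avoid log(0)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)
```

Unlike a squared-error loss for regression, this penalizes confident wrong probability estimates very heavily, which is why it is preferred for classification.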
Why It Matters
The loss function defines what “better” means during training: optimizers adjust model parameters to reduce it, so the choice of loss directly shapes what the model learns.
How It Works (Conceptually)
- Compare prediction with target
- Compute a scalar error value
- Lower loss indicates better performance
Minimal Python Example
Python

def mean_squared_error(y_true, y_pred):
    # Average squared difference between targets and predictions
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
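A quick self-contained check of the conceptual steps above, using the same squared-error idea on single values (the helper name is illustrative): comparing prediction with target yields a scalar, and the closer prediction gets the lower loss.

```python
def squared_error(y_true, y_pred):
    # Squared difference for a single prediction
    return (y_true - y_pred) ** 2

close = squared_error(3.0, 2.9)  # near-miss
far = squared_error(3.0, 1.0)    # large miss
assert close < far  # lower loss indicates the better prediction
```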
Common Pitfalls
- Confusing loss with accuracy (loss is the quantity being optimized; accuracy is a separate evaluation metric, and the two can move independently)
- Using a loss that does not match the task (e.g., mean squared error for a classification problem)
- Ignoring loss scale when comparing models or combining multiple loss terms
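The first pitfall above can be illustrated with a short sketch (the helper name is hypothetical): two models that classify the same example correctly have identical accuracy, yet the more confident one attains a much lower cross-entropy loss.

```python
import math

def cross_entropy_term(p_correct):
    # Negative log of the probability assigned to the correct class
    return -math.log(p_correct)

# Both models are "right" on this example, so accuracy is the same,
# but the confident model's loss is far lower.
confident = cross_entropy_term(0.99)
hesitant = cross_entropy_term(0.51)
assert confident < hesitant
```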
Related Concepts
- Gradient Descent
- Backpropagation
- Optimization