Short Definition
Transfer learning reuses knowledge from a model pretrained on one task to improve learning on a related task.
Definition
Transfer learning leverages models trained on large datasets to solve related tasks with less data. Instead of learning features from scratch, the model starts from weights that already encode useful representations.
This approach is widely used in vision, language, and speech tasks.
Why It Matters
Transfer learning reduces training time and data requirements because the model starts from useful features rather than random weights, which is especially valuable when labeled data for the target task is scarce.
How It Works (Conceptually)
- Start from a model pretrained on a large, related dataset
- Freeze early layers, which tend to capture general features
- Fine-tune later layers or a task-specific head on the target data
Minimal Python Example
for layer in frozen_layers:
    layer.trainable = False
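The snippet above assumes `frozen_layers` and a Keras-style `trainable` flag already exist. A minimal self-contained sketch of the same idea follows; `Layer`, `build_pretrained_model`, and the layer names are illustrative stand-ins, not part of any specific library:

```python
class Layer:
    """Toy stand-in for a framework layer with a trainable flag."""
    def __init__(self, name):
        self.name = name
        self.trainable = True  # all layers start trainable

def build_pretrained_model():
    # Stand-in for loading a pretrained backbone plus a new head.
    return [Layer(f"conv{i}") for i in range(1, 5)] + [Layer("head")]

model = build_pretrained_model()

# Freeze everything except the task-specific head, so fine-tuning
# updates only the parameters that remain trainable.
for layer in model:
    layer.trainable = (layer.name == "head")

trainable = [layer.name for layer in model if layer.trainable]
print(trainable)  # ['head']
```

In a real framework the pattern is the same: load pretrained weights, mark the backbone non-trainable, and train only the new head before optionally unfreezing more layers.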
Common Pitfalls
- Transferring from an unrelated source task, so the pretrained features do not apply
- Freezing too many layers, which prevents the model from adapting to the new task
- Overfitting when fine-tuning many parameters on a small dataset
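One common guard against the overfitting pitfall is early stopping on a validation set. A hedged sketch of the stopping rule, with `patience` and the loss values chosen purely for illustration:

```python
def should_stop(val_losses, patience=2):
    """Return True once validation loss has not improved for `patience` epochs."""
    best = min(val_losses)
    best_epoch = val_losses.index(best)
    return len(val_losses) - 1 - best_epoch >= patience

# Validation loss improves, then drifts upward: stop fine-tuning.
losses = [0.9, 0.7, 0.65, 0.66, 0.68]
print(should_stop(losses))  # True: no improvement in the last 2 epochs
```

Combined with a small learning rate for the unfrozen layers, this keeps fine-tuning from erasing the pretrained features on a small target dataset.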
Related Concepts
- Fine-Tuning
- Pretrained Models
- Generalization