Feature Scaling

Short Definition

Feature scaling rescales input features to comparable ranges.

Definition

Feature scaling ensures that input features contribute proportionally during training. Without scaling, features with large numeric ranges can dominate gradient updates and destabilize optimization.

Common scaling methods include normalization and standardization.
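The two methods differ in their targets: normalization (min-max scaling) maps values into a fixed range such as [0, 1], while standardization (z-scoring) rescales to zero mean and unit standard deviation. A minimal sketch of both, assuming NumPy is available:

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0])

# Normalization (min-max): rescale values into the range [0, 1]
normalized = (x - x.min()) / (x.max() - x.min())

# Standardization (z-score): zero mean, unit standard deviation
standardized = (x - x.mean()) / x.std()
```

Min-max scaling preserves the shape of the original distribution but is sensitive to outliers; standardization is the more common default for gradient-based models.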

Why It Matters

Well-scaled inputs speed up and stabilize gradient-based training, and they keep large-range features from dominating distance-based methods such as k-nearest neighbors.

How It Works (Conceptually)

  • Rescale features to fixed ranges
  • Prevent dominance by large-valued inputs
  • Improve gradient behavior
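The gradient point above can be made concrete: for a squared-error loss on a linear model, each weight's gradient is proportional to its feature's value, so unscaled features produce gradient components of wildly different magnitudes. A small illustrative sketch (the toy values are hypothetical):

```python
def grad(w, x, y):
    """Gradient of (w . x - y)^2 with respect to each weight w_i:
    dL/dw_i = 2 * (w . x - y) * x_i"""
    err = sum(wi * xi for wi, xi in zip(w, x)) - y
    return [2 * err * xi for xi in x]

# Two features whose scales differ by a factor of 10^6
x = [1000.0, 0.001]
g = grad([0.0, 0.0], x, 1.0)

# The gradient components differ by the same 10^6 factor,
# so one learning rate cannot suit both weights.
ratio = abs(g[0] / g[1])
```

With both features rescaled to comparable ranges, the gradient components become comparable too, and a single learning rate works for all weights.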

Minimal Python Example

min_val, max_val = min(x), max(x)  # per-feature statistics from the data
scaled = [(v - min_val) / (max_val - min_val) for v in x]  # min-max scaling to [0, 1]

Common Pitfalls

  • Fitting scaling parameters on the test set instead of reusing training statistics (data leakage)
  • Mixing scaling strategies across features or between training and inference
  • Ignoring feature distributions (min-max scaling is sensitive to outliers)
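The first pitfall is worth a concrete sketch: scaling parameters must be computed from the training data only and then reused unchanged on the test data. The values below are hypothetical:

```python
train = [2.0, 4.0, 6.0]
test = [8.0]

# Fit min-max parameters on the training split ONLY
min_val, max_val = min(train), max(train)

# Apply the same parameters to both splits
scaled_train = [(v - min_val) / (max_val - min_val) for v in train]
scaled_test = [(v - min_val) / (max_val - min_val) for v in test]

# Test values may land outside [0, 1]; that is expected and correct.
```

Refitting min_val and max_val on the test split would silently shift the feature space the model was trained on and leak test statistics into preprocessing.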

Related Concepts

  • Normalization
  • Input Preprocessing
  • Training Stability