Grid Search

Short Definition

Grid search exhaustively evaluates all combinations of specified hyperparameters.

Definition

Grid search is a hyperparameter optimization method that defines a discrete set of candidate values for each hyperparameter and evaluates every possible combination. It is simple, deterministic, and easy to implement, but the number of combinations grows exponentially with the number of hyperparameters.

Grid search is most useful for small, low-dimensional search spaces.
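The cost grows multiplicatively with each added hyperparameter, which is why the method is limited to small search spaces. A quick calculation makes this concrete (the hyperparameter names and grid sizes below are illustrative):

```python
# Total configurations = product of the per-hyperparameter grid sizes.
grid_sizes = {"learning_rate": 5, "batch_size": 4, "dropout": 3}

total = 1
for size in grid_sizes.values():
    total *= size

print(total)  # 5 * 4 * 3 = 60 training runs
```

Adding a fourth hyperparameter with five values would multiply this by five again, to 300 runs.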

Why It Matters

Grid search provides a clear baseline for hyperparameter tuning and ensures complete coverage of a predefined search space.

It is often used in teaching and in small experiments where exhaustive coverage is affordable.

How It Works (Conceptually)

  • Specify discrete values for each hyperparameter
  • Construct the Cartesian product of all values
  • Train and evaluate a model for each combination
  • Select the best-performing configuration

Minimal Python Example


Python
# Evaluate every combination of learning rate and batch size.
for lr in [0.01, 0.1]:
    for batch in [16, 32]:
        train(lr, batch)  # assumes a train() function is defined elsewhere
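The nested loops above only run each configuration; a complete grid search also records a score per configuration and selects the best one. A minimal sketch of that last step, where `evaluate` is a hypothetical stand-in for training a model and returning a validation score:

```python
import itertools

def evaluate(lr, batch):
    # Hypothetical objective: stands in for training a model and
    # returning a validation score (higher is better).
    return -(lr - 0.05) ** 2 - (batch - 24) ** 2 / 1000

grid = {
    "lr": [0.01, 0.1],
    "batch": [16, 32],
}

# itertools.product builds the Cartesian product of all value lists.
best_score, best_config = float("-inf"), None
for lr, batch in itertools.product(grid["lr"], grid["batch"]):
    score = evaluate(lr, batch)
    if score > best_score:
        best_score, best_config = score, {"lr": lr, "batch": batch}

print(best_config)
```

Using itertools.product instead of hand-written nested loops keeps the code the same length no matter how many hyperparameters are added.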

Common Pitfalls

  • Exponential growth of combinations
  • Wasted computation on unimportant dimensions
  • Assuming grid coverage implies optimality (the true optimum may lie between grid points)
  • Using grid search for large models

Related Concepts

  • Hyperparameter Optimization
  • Hyperparameters
  • Random Search
  • Training Cost
  • Validation Metrics