Regularization & Generalization
Overfitting: model learns noise; low training loss, high validation loss.
Underfitting: model too simple; high training and validation loss.
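These two loss signatures can be checked mechanically. A minimal sketch of the diagnosis, assuming final scalar losses are at hand; `diagnose_fit` and its thresholds are hypothetical names and values chosen for illustration.

```python
def diagnose_fit(train_loss: float, val_loss: float,
                 gap_tol: float = 0.1, high_loss: float = 1.0) -> str:
    """Label the fit regime from final training and validation losses."""
    if val_loss - train_loss > gap_tol:
        return "overfitting"   # low training loss, much higher validation loss
    if train_loss > high_loss:
        return "underfitting"  # training loss itself stays high
    return "ok"

print(diagnose_fit(train_loss=0.05, val_loss=0.80))  # overfitting
print(diagnose_fit(train_loss=1.50, val_loss=1.55))  # underfitting
```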

Regularization: techniques to improve generalization; a minimal sketch of each follows this list:

L2 (weight decay): penalize large weights.
Early stopping: stop training when validation loss worsens.
Dropout: randomly drop units during training (simulated in code by masking).
Data augmentation: alter inputs (flips/crops/noise) to create variety.
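A minimal sketch of L2 regularization on a linear least-squares model; the penalty strength `lam` and step size `lr` are illustrative assumptions. Folding the penalty's gradient `2 * lam * w` into the update is why plain-SGD L2 is also called weight decay: each step shrinks `w` before applying the data gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                 # inputs
y = X @ np.array([1., 2., 0., 0., 3.]) + 0.1 * rng.normal(size=100)

w = np.zeros(5)
lam, lr = 0.1, 0.01                           # penalty strength, step size

for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)     # gradient of the MSE
    grad += 2 * lam * w                       # gradient of lam * ||w||^2
    w -= lr * grad                            # shrinks weights toward zero

print(w)                                      # smaller than the unpenalized fit
```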

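A minimal early-stopping skeleton; `train_epoch` and `eval_loss` are hypothetical caller-supplied functions, and `patience` (epochs tolerated without improvement) is a common refinement on the bare "stop when validation loss worsens" rule.

```python
def fit_with_early_stopping(train_epoch, eval_loss, max_epochs=100, patience=5):
    """Train until validation loss stops improving; return the best weights."""
    best_loss, best_state, bad_epochs = float("inf"), None, 0
    for _ in range(max_epochs):
        state = train_epoch()           # one pass over the training data
        val = eval_loss(state)          # loss on held-out validation data
        if val < best_loss:
            best_loss, best_state, bad_epochs = val, state, 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:  # no improvement for `patience` epochs
                break
    return best_state                   # the weights that generalized best
```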

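A minimal sketch of dropout as the masking mentioned above, written in "inverted" form (rescaling by `1 / (1 - p)` at training time) so inference needs no correction; the drop probability `p` is an assumption.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Zero each unit with probability p during training; identity otherwise."""
    if not training or p == 0.0:
        return x                        # no-op at inference time
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(x.shape) >= p     # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)         # rescale so the expected value is unchanged

h = np.ones((2, 4))
print(dropout(h, p=0.5, rng=np.random.default_rng(0)))  # about half the units zeroed
```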

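A minimal sketch of flip/crop/noise augmentation on a single grayscale image array; the crop size and noise scale are illustrative assumptions. Each call yields a slightly different view of the same example, which is the variety the technique relies on.

```python
import numpy as np

def augment(img, rng, crop=24, noise_std=0.05):
    """Return a randomly flipped, cropped, and noised copy of a 2-D image."""
    if rng.random() < 0.5:
        img = img[:, ::-1]                              # random horizontal flip
    top = rng.integers(0, img.shape[0] - crop + 1)      # random crop position
    left = rng.integers(0, img.shape[1] - crop + 1)
    img = img[top:top + crop, left:left + crop]
    return img + rng.normal(0.0, noise_std, img.shape)  # additive Gaussian noise

rng = np.random.default_rng(0)
img = rng.random((28, 28))
print(augment(img, rng).shape)                          # (24, 24): a new training view
```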