Batch Normalization
Definition: Normalizing the inputs or outputs of activation functions in hidden layers to stabilize training and reduce overfitting.

Batch Normalization: The Stabilizer

What it does: Neural networks get "moody" when the numbers flowing through them become too large or too small (exploding or vanishing gradients). Batch Normalization forces those numbers into a standard range (mean 0, variance 1), which keeps gradients well-behaved and training stable.
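The normalization step above can be sketched in a few lines of NumPy. This is a minimal illustration of the training-time forward pass only: the function name, shapes, and epsilon value are illustrative, and real layers additionally keep running statistics for use at inference time.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch dimension, then scale and shift.

    x: array of shape (batch_size, num_features); gamma/beta are learnable
    per-feature parameters (here just fixed arrays for illustration).
    """
    mu = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                    # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # standardize to mean 0, variance 1
    return gamma * x_hat + beta            # learnable scale and shift

# A batch of 4 samples with 3 features on wildly different scales
x = np.array([[1.0, 200.0, -3.0],
              [2.0, 220.0, -1.0],
              [3.0, 180.0,  0.0],
              [4.0, 210.0,  2.0]])

out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0))  # each feature now has mean ~0
print(out.std(axis=0))   # and standard deviation ~1
```

Note that the learnable gamma and beta let the network undo the normalization if that turns out to be useful, so the layer never reduces what the network can express.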
