Batch Normalization
Definition: Normalizing the inputs or outputs of activation functions in hidden layers so each mini-batch has a standard distribution, which stabilizes training and acts as a mild regularizer.
Batch Normalization: The Stabilizer
What it does
Neural networks get “moody” if the numbers flowing through them drift too big or too small (exploding/vanishing gradients). Batch Normalization forces each layer's activations back into a nice, standard range (mean 0, variance 1 across the mini-batch), then applies a learnable scale and shift so the network keeps its expressive power. The result: faster, more stable training.
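A minimal sketch of the idea, assuming NumPy and using illustrative parameter names (`gamma`, `beta` for the learnable scale and shift):

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a batch of activations (rows = samples) to mean 0,
    variance 1 per feature, then rescale with gamma and beta."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # the "stabilizing" step
    return gamma * x_hat + beta

# Activations drifting into a "moody" range: one feature is huge,
# the other is small.
x = np.array([[100.0, 0.5],
              [120.0, 1.5],
              [ 80.0, 1.0]])
y = batch_norm(x)
print(y.mean(axis=0))  # ~[0, 0]
print(y.var(axis=0))   # ~[1, 1]
```

With `gamma=1` and `beta=0` this is pure standardization; in a real network those two are trained alongside the weights.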
The Vibe Analogy: “Standardizing Inputs”
In Vibe Coding, your “inputs” are your prompts and context files.
- Un-normalized Context: One file has tabs, one has spaces. One uses camelCase, one uses snake_case. The prompt is screaming in ALL CAPS.
- Result: The AI gets confused and outputs unstable code.
- Normalized Context: You use a linter (Prettier/Black) on your code before pasting it to the AI. You use a structured prompt template.
- Result: The AI recognizes the pattern immediately and outputs stable, high-quality code.
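The "normalize before pasting" step above can be sketched with a crude, purely illustrative cleaner (a stand-in for a real formatter like Prettier or Black): tabs become spaces, trailing whitespace is stripped, and shared indentation is removed. `normalize_context` is a hypothetical helper name, not a library function.

```python
import textwrap

def normalize_context(snippet: str, tab_width: int = 4) -> str:
    """Crude "batch norm" for pasted code: tabs -> spaces,
    trailing whitespace stripped, common indentation removed."""
    lines = snippet.expandtabs(tab_width).splitlines()
    cleaned = "\n".join(line.rstrip() for line in lines)
    return textwrap.dedent(cleaned).strip("\n")

messy = "\tdef add(a, b):\t\n\t\treturn a + b   \n"
print(normalize_context(messy))
# def add(a, b):
#     return a + b
```

In practice you would run the real linter/formatter for your language; the point is that the AI sees one consistent style instead of three.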
“Normalization” of Prompts
Think of your system prompt as a Batch Norm layer.
“You are a helpful assistant. Always output JSON. Always use ISO dates.”
This “normalizes” the output distribution: no matter what crazy user input comes in, the output remains in the “stable range” (valid JSON).
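One way to picture this "norm layer" in code, assuming a hypothetical validation helper (`is_stable` is an illustrative name; the system prompt text is taken from above):

```python
import json

# A fixed system prompt acting like a Batch Norm layer: every call
# is constrained toward the same stable output distribution.
SYSTEM_PROMPT = (
    "You are a helpful assistant. "
    "Always output JSON. Always use ISO dates."
)

def is_stable(response: str) -> bool:
    """Check the model's output stayed in the 'stable range': valid JSON."""
    try:
        json.loads(response)
        return True
    except json.JSONDecodeError:
        return False

print(is_stable('{"due": "2025-01-31"}'))   # True
print(is_stable("Sure! The date is Jan 31."))  # False
```

The check is the cheap part; the system prompt is what pushes the distribution of responses toward passing it.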
Expert Tip
If the AI is hallucinating or acting weird, “Normalize” the conversation. Summarize the current state, clear the history, and restate the goal clearly. Stabilize the vibe.
