Bagging (Bootstrap Aggregating)
Definition: An ensemble technique that trains multiple models on random subsets of the data (sampled with replacement) and aggregates their predictions to improve stability and accuracy.
If you have one expert, their judgment might be biased. If you have 50 experts and take the majority vote, you usually get a better answer. That’s Bagging: averaging many models cancels out their individual errors. (Random Forest is the most famous example.)
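The mechanics can be sketched in a few lines of plain Python. The 1-nearest-neighbour base learner and the toy dataset here are illustrative choices, not part of any library:

```python
import random
from collections import Counter

def nearest_label(sample, x):
    # Toy base learner: 1-nearest-neighbour on this bootstrap sample
    return min(sample, key=lambda p: abs(p[0] - x))[1]

def bag_predict(data, x, n_models=51, seed=0):
    rng = random.Random(seed)
    votes = []
    for _ in range(n_models):
        # Bootstrap: draw len(data) points WITH replacement
        sample = [rng.choice(data) for _ in data]
        votes.append(nearest_label(sample, x))
    # Aggregate: majority vote across the 51 "experts"
    return Counter(votes).most_common(1)[0][0]

data = [(1, "neg"), (2, "neg"), (3, "neg"),
        (8, "pos"), (9, "pos"), (10, "pos")]
print(bag_predict(data, 2.5))
print(bag_predict(data, 8.5))
```

Any single bootstrap sample may miss informative points, but the majority vote over all 51 samples is far more stable than any one model's guess.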
Vibe Coding with “Ensembles”
You can apply Bagging principles to LLM coding.
- Scenario: You have a critical SQL query to write.
- The Bagging Strategy:
  1. Open 3 different chats (or use 3 different models: GPT-4, Claude 3.5, Gemini Pro).
  2. Give them all the same prompt.
  3. Compare the results.
  4. The “common code” between them is likely the correct logic; the differences are where the hallucinations live.
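The comparison step can be automated once you have the three replies. A minimal sketch: the replies below are hypothetical stand-ins for real model output, and `normalize_sql` is a deliberately crude normaliser so formatting differences don’t mask agreement:

```python
from collections import Counter

def normalize_sql(sql):
    # Crude normalisation: lowercase, collapse whitespace, drop trailing ";"
    return " ".join(sql.lower().split()).rstrip(";")

def majority_answer(responses):
    # Each model's reply is one vote; the most common normalised answer
    # is the "common code" most likely to be correct
    counts = Counter(normalize_sql(r) for r in responses)
    return counts.most_common(1)[0][0]

# Stub replies standing in for three real model calls (hypothetical data)
replies = [
    "SELECT id, total FROM orders WHERE total > 100;",
    "select id, total\nfrom orders\nwhere total > 100",
    "SELECT id, SUM(total) FROM orders GROUP BY id;",  # the odd one out
]
print(majority_answer(replies))
```

Two of the three replies agree once normalised, so they win the vote; the third is flagged as the outlier worth scrutinising.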
Self-Consistency
This is a formal prompting technique called Self-Consistency.
- Prompt: “Generate 3 different ways to solve this problem. Then analyze the pros and cons of each and pick the best one.”
- Result: This forces the model to “ensemble” its own internal reasoning paths, leading to much more robust code than a single-pass, first-draft answer.
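Self-consistency is the same vote, applied to one model’s independent reasoning paths. A sketch, where `sample_fn` stands in for a temperature > 0 model call and the stubbed answers are made up for illustration:

```python
import itertools
from collections import Counter

def self_consistent_answer(sample_fn, n_paths=5):
    # Draw several independent reasoning paths, then keep the answer
    # the paths agree on most often (self-consistency voting)
    answers = [sample_fn() for _ in range(n_paths)]
    return Counter(answers).most_common(1)[0][0]

# Stub standing in for repeated sampled LLM calls (hypothetical data):
# four paths reach 42, one goes astray and reaches 17
fake_paths = itertools.cycle([42, 42, 17, 42, 42])
result = self_consistent_answer(lambda: next(fake_paths))
print(result)
```

One stray reasoning path (the 17) is outvoted, which is exactly why sampled-and-voted answers beat a single draw.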
