AdaBoost
AdaBoost, short for Adaptive Boosting, is an ensemble technique that combines multiple weak learners (simple models) into a single strong model. Here's how it works in simple steps:
- Start with Equal Weights: All data points are initially given equal importance (weight).
- Train a Weak Learner: Train a simple model (like a decision stump) on the data.
- Calculate Errors: See which data points are misclassified by the model.
- Update Weights: Increase the weights of the misclassified data points so that the next model focuses more on them.
- Repeat: Train another weak learner on the updated weights and repeat the process.
- Combine Models: The final prediction is a weighted vote of all the weak learners.
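The steps above can be sketched in a from-scratch implementation. This is a minimal illustration, not a production version: it assumes labels encoded as -1/+1, uses brute-force decision stumps as the weak learners, and the function and variable names (`train_stump`, `adaboost`, `n_rounds`, etc.) are invented for this example.

```python
import numpy as np

def train_stump(X, y, w):
    # Step 2: train a weak learner. Find the (feature, threshold, polarity)
    # split that minimizes the weighted error under the current weights w.
    n, d = X.shape
    best, best_err = None, float("inf")
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])  # step 3: weighted error
                if err < best_err:
                    best, best_err = (j, thr, polarity), err
    return best, best_err

def stump_predict(stump, X):
    j, thr, polarity = stump
    return np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)

def adaboost(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)  # step 1: equal weights for all points
    ensemble = []
    for _ in range(n_rounds):  # step 5: repeat
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-10)  # avoid division by zero for a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)  # this learner's vote weight
        pred = stump_predict(stump, X)
        # Step 4: raise the weights of misclassified points (y * pred = -1
        # there, so the exponent is positive), lower them for correct ones.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    # Step 6: final prediction is the sign of the weighted vote.
    score = sum(alpha * stump_predict(s, X) for alpha, s in ensemble)
    return np.sign(score)
```

On a tiny separable dataset, e.g. `X = np.array([[0.], [1.], [2.], [3.]])` with `y = np.array([-1, -1, 1, 1])`, `predict(adaboost(X, y), X)` recovers the labels after a few rounds.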