AdaBoost

AdaBoost, short for Adaptive Boosting, is a technique that combines multiple weak learners (simple models that perform only slightly better than chance, such as one-level decision trees) to create a strong overall model. Here’s how it works in simple steps:

  1. Start with Equal Weights: All data points are initially given equal importance (weight).
  2. Train a Weak Learner: Train a simple model (like a decision stump) on the data.
  3. Calculate Errors: See which data points are misclassified by the model.
  4. Update Weights: Increase the weights of the misclassified data points so that the next model focuses more on them.
  5. Repeat: Train another weak learner on the updated weights and repeat the process.
  6. Combine Models: The final prediction is a weighted vote of all the weak learners, where more accurate learners get a larger say.
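
The steps above can be sketched in code. This is a minimal from-scratch illustration, not a production implementation: it assumes binary labels in {-1, +1}, uses brute-force one-feature threshold stumps as the weak learners, and all function names here are illustrative.

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    """A decision stump: predict +1 on one side of a threshold, -1 on the other."""
    return np.where(polarity * (X[:, feature] - threshold) >= 0, 1, -1)

def train_adaboost(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)                      # step 1: equal weights
    stumps = []
    for _ in range(n_rounds):
        # step 2: train a weak learner -- pick the stump with lowest weighted error
        best, best_err = None, np.inf
        for f in range(X.shape[1]):
            for t in np.unique(X[:, f]):
                for polarity in (1, -1):
                    pred = stump_predict(X, f, t, polarity)
                    err = w[pred != y].sum()     # step 3: weighted error
                    if err < best_err:
                        best_err, best = err, (f, t, polarity)
        f, t, polarity = best
        err = np.clip(best_err, 1e-10, 1 - 1e-10)        # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)            # learner's vote weight
        pred = stump_predict(X, f, t, polarity)
        w *= np.exp(-alpha * y * pred)           # step 4: up-weight mistakes
        w /= w.sum()                             # renormalize to sum to 1
        stumps.append((f, t, polarity, alpha))   # step 5: repeat with new weights
    return stumps

def predict(stumps, X):
    """Step 6: weighted vote of all the weak learners."""
    total = np.zeros(len(X))
    for f, t, polarity, alpha in stumps:
        total += alpha * stump_predict(X, f, t, polarity)
    return np.where(total >= 0, 1, -1)
```

For example, on a tiny one-feature dataset `X = [[0], [1], [2], [3]]` with labels `y = [-1, -1, 1, 1]`, a few rounds of `train_adaboost` followed by `predict` recover the labels. Note how `alpha` grows as a learner's weighted error shrinks, which is exactly why accurate learners dominate the final vote.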