Machine Learning: The Engineering Approach
Module 6 of 13
6. Gradient Boosting
1. Weak Learners
A single shallow Decision Tree is a weak learner: on its own, only slightly better than guessing. But 1,000 trees voting together (a Random Forest) form a strong learner, because averaging many noisy predictions cancels out their individual errors.
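The averaging intuition can be sketched without any trees at all. In this toy example (the data and variable names are illustrative, not from the course), each "weak learner" is just a noisy guess at a true value; one guess alone is unreliable, but the mean of 1,000 guesses is close to the truth:

```python
import numpy as np

# Hypothetical toy setup: 1,000 weak learners, each producing a noisy
# estimate of the same true value. Averaging them cancels the noise,
# which is the intuition behind Random Forest voting.
rng = np.random.default_rng(0)
true_value = 5.0
weak_predictions = true_value + rng.normal(0, 2.0, size=1000)

single_error = abs(weak_predictions[0] - true_value)        # one weak learner
ensemble_error = abs(weak_predictions.mean() - true_value)  # all 1000 voting

print(f"single weak learner error:  {single_error:.3f}")
print(f"1000-learner average error: {ensemble_error:.3f}")
```

The standard deviation of the average shrinks roughly with the square root of the number of learners, which is why the ensemble error lands far below the typical single-learner error.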
2. XGBoost / LightGBM
Instead of growing trees independently, gradient boosting trains each new tree on the errors (residuals) of the ensemble built so far. XGBoost and LightGBM are highly optimized implementations of this idea, and they remain the state of the art for tabular data.
import numpy as np
import xgboost as xgb

X = np.random.rand(100, 4)  # toy feature matrix: 100 rows, 4 features
y = np.random.rand(100)     # toy regression targets

model = xgb.XGBRegressor()
model.fit(X, y)
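The "each tree fixes the previous errors" loop can also be sketched from scratch. This is a minimal illustration using scikit-learn's DecisionTreeRegressor as the weak learner, not XGBoost's actual internals; the data, the learning rate, and all variable names are assumptions for the demo:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy dataset: a noisy sine wave (illustrative only).
rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)

learning_rate = 0.3
prediction = np.zeros_like(y)  # the ensemble starts by predicting zero
trees = []

for _ in range(20):
    residual = y - prediction                      # what the ensemble still gets wrong
    tree = DecisionTreeRegressor(max_depth=2)      # a deliberately weak learner
    tree.fit(X, residual)                          # fit the errors, not y itself
    prediction += learning_rate * tree.predict(X)  # add a shrunken correction
    trees.append(tree)

mse = np.mean((y - prediction) ** 2)
print(f"training MSE after 20 rounds: {mse:.4f}")
```

Each round, the new tree is fit to the current residuals and its prediction is scaled down by the learning rate before being added, so the training error shrinks gradually over the 20 rounds rather than in one greedy jump.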