TY - JOUR
T1 - AddGBoost: A gradient boosting-style algorithm based on strong learners
AU - Sipper, Moshe
AU - Moore, Jason H.
PY - 2022/3
AB - We present AddGBoost, a gradient boosting-style algorithm, wherein the decision tree is replaced by a succession of (possibly) stronger learners, which are optimized via a state-of-the-art hyperparameter optimizer. Through experiments over 90 regression datasets we show that AddGBoost emerges as the top performer for 33% (with 2 stages) up to 42% (with 5 stages) of the datasets, when compared with seven well-known machine-learning algorithms: KernelRidge, LassoLars, SGDRegressor, LinearSVR, DecisionTreeRegressor, HistGradientBoostingRegressor, and LGBMRegressor.
DO - 10.1016/j.mlwa.2021.100243
M3 - Article
SN - 2666-8270
VL - 7
JO - Machine Learning with Applications
JF - Machine Learning with Applications
M1 - 100243
ER -