stackgbm offers a minimalist, research-oriented implementation of model stacking (Wolpert, 1992) for gradient boosted tree models built by xgboost (Chen and Guestrin, 2016), lightgbm (Ke et al., 2017), and catboost (Prokhorenkova et al., 2018).
The easiest way to get stackgbm is to install from CRAN:
install.packages("stackgbm")
Alternatively, to use a new feature or get a bug fix, you can install the development version of stackgbm from GitHub:
```r
# install.packages("remotes")
remotes::install_github("nanxstats/stackgbm")
```
To install all potential dependencies, check out the instructions in manage dependencies.
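As a rough sketch, the boosting backends released on CRAN can be installed directly, while catboost is not distributed on CRAN and should be installed by following its own documentation (or the manage dependencies guide above):

```r
# Install the boosting backends available on CRAN
install.packages(c("xgboost", "lightgbm"))

# catboost is not on CRAN; install its R package by following the
# official catboost installation instructions (see the manage
# dependencies guide linked above)
```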
stackgbm implements a classic two-layer stacking model: the first layer generates "features" from the predictions of the gradient boosted tree models, and the second layer is a logistic regression that uses these features as inputs.
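To make the two-layer idea concrete, here is a minimal conceptual sketch (not the stackgbm API): the first-layer "features" are simulated placeholders standing in for the probabilities the boosted tree models would produce, and the second layer is a plain logistic regression fit with glm().

```r
# Conceptual sketch of two-layer stacking (not the stackgbm API)
set.seed(42)
n <- 500
y <- rbinom(n, 1, 0.5)

# Hypothetical first-layer outputs: three probability "features",
# one per boosting library, loosely correlated with the outcome
simulate_pred <- function(y) plogis(2 * y - 1 + rnorm(length(y)))
features <- data.frame(
  pred_xgb = simulate_pred(y),
  pred_lgb = simulate_pred(y),
  pred_cat = simulate_pred(y)
)

# Second layer: logistic regression on the first-layer features
meta_data <- data.frame(y = y, features)
meta_model <- glm(y ~ ., data = meta_data, family = binomial())
summary(meta_model)
```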
For a more comprehensive and flexible implementation of model stacking, see stacks in tidymodels, mlr3pipelines in mlr3, and StackingClassifier in scikit-learn.
Please note that the stackgbm project is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.