LightGBM

LightGBM is a tree-based gradient boosting library designed to be distributed and efficient. The algorithm provides fast training speed, low memory usage, and good accuracy, and is capable of handling large-scale data.

For more information on gradient tree boosting, see the "Gradient tree boosting" algorithm.

"Gradient-based One-Side Sampling" (GOSS) is a variant of "Gradient Boosting Decision Tree" (GBDT) where, at each step of the training phase, all of the data with a large gradient is kept while random sampling is performed on the data with a small gradient.
Maximum depth of each tree. Higher values can improve prediction quality but can also lead to over-fitting. Typical values: 3 - 10. Set to -1 for unlimited depth.
Bagging can be used to speed up training and/or deal with over-fitting. Please note that Gradient-based One-Side Sampling does not support bagging. As a consequence, bagging parameters will be ignored for this boosting type.
Subsample ratio of the training instances.
Frequency of subsampling. Values <= 0 disable bagging.
Use LightGBM's built-in early stopping mechanism so that the optimal number of trees is determined automatically.
The cross-validation scheme defined in the "Hyperparameters" tab will be used.
Early stopping is not available for causal prediction.
Training stops if the loss does not improve for this number of consecutive iterations. Typical values: 1 - 10.
Using a fixed random seed allows for reproducible results.
Number of cores used for parallel training. Using more cores leads to faster training but at the expense of more memory consumption, especially for large training datasets.
Allow DSS to use sparse matrices to train the model.
This may help reduce RAM and CPU usage.