Decision Tree

Decision Tree is a simple, non-parametric algorithm. It builds a model that predicts the value of the target by learning simple decision rules inferred from the data features.

These rules form a tree whose leaves carry the predicted class. To make a prediction, evaluation walks down the tree, testing the rule at each split until it reaches a leaf.
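As a sketch of what these learned rules look like, the snippet below fits a small tree and prints its splits. It assumes scikit-learn as the underlying implementation; `export_text` renders each node's feature/threshold test and the class carried by each leaf.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)

# Fit a shallow tree so the printed rules stay readable.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# Each inner node tests one feature against a threshold;
# each leaf line shows the predicted class.
print(export_text(tree))
```

Reading the output top to bottom mirrors evaluation: follow the branch whose condition holds at each split until a leaf (and its class) is reached.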

Two hyperparameters control how splits are chosen.

Criterion: the function used to measure the quality of a split. Supported criteria are “gini” for the Gini impurity and “entropy” for the information gain.

Splitter: the strategy used to choose the split at each node. Supported strategies are “best” to choose the best split and “random” to choose the best random split.
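These two settings can be passed directly when constructing the model. The sketch below assumes scikit-learn's `DecisionTreeClassifier`, whose `criterion` and `splitter` parameters match the descriptions above.

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# criterion: "gini" (Gini impurity) or "entropy" (information gain)
# splitter:  "best" (best split) or "random" (best random split)
clf = DecisionTreeClassifier(criterion="entropy", splitter="best",
                             max_depth=3, random_state=0)
clf.fit(X, y)

# Predict the class of the first sample by walking down the tree.
print(clf.predict(X[:1]))
```

Switching `splitter` to "random" trades some split quality for speed and extra randomization, which can help when trees are combined in an ensemble.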