Feature importance is based on Shapley values, which estimate each feature's influence on the model's predictions.
Note: because you are using k-fold cross-testing, feature importance is computed on the full dataset.
Note: feature importance is computed on the test dataset.
Absolute feature importance is the average of the absolute Shapley values computed for each feature.
Note: the 20 most important features represent {{roundedPercentageOfTotalmportance}}% of the total feature importance.
Feature effects displays, for each feature, the Shapley values computed across individual records.
The Shapley values (x-axis) show the impact of the feature value (color) on each record's prediction.
A positive Shapley value means this feature tended to push the prediction towards {{getTargetVariable()}}={{modelData.classes[1]}}.
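The "push towards the positive class" reading can be made concrete in log-odds space, where a record's margin is the base value plus the sum of its per-feature Shapley values. The base value and contributions below are invented numbers for illustration only:

```python
import numpy as np

# Hypothetical additive explanation for one record of a binary classifier:
# prediction margin (log-odds) = base value + sum of Shapley values.
base_value = -0.4                          # assumed expected log-odds over the dataset
shap_values = np.array([0.9, -0.2, 0.3])   # assumed per-feature contributions

margin = base_value + shap_values.sum()
probability = 1 / (1 + np.exp(-margin))    # sigmoid maps log-odds to probability

# The positive contributions (0.9, 0.3) raise the margin, pushing the
# prediction towards the positive class; the negative one (-0.2) lowers it.
print(f"margin={margin:+.2f}, p(positive class)={probability:.3f}")
```

Because the contributions add up in log-odds space, each positive Shapley value strictly increases the predicted probability of the positive class for that record.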
Feature dependence displays the distribution of Shapley values across different values of a feature.
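One way to picture this distribution is to bin the feature and summarize the Shapley values falling in each bin. The synthetic data below (a linear effect plus noise) is purely illustrative:

```python
import numpy as np

# Synthetic example: one feature's values and its Shapley value per record.
rng = np.random.default_rng(0)
feature_values = rng.uniform(0, 100, size=200)
# Assume a roughly linear effect around feature value 50, plus noise.
shap_values = 0.02 * (feature_values - 50) + rng.normal(0, 0.1, size=200)

# Bin the feature range and describe the Shapley distribution per bin.
bins = np.linspace(0, 100, 6)
bin_idx = np.digitize(feature_values, bins[1:-1])
for b in range(5):
    vals = shap_values[bin_idx == b]
    print(f"[{bins[b]:5.1f}, {bins[b + 1]:5.1f}): "
          f"mean={vals.mean():+.3f}, n={len(vals)}")
```

Reading the per-bin summaries from low to high feature values reveals the trend a feature dependence plot visualizes: here, higher feature values carry increasingly positive Shapley values.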