
SHAP Feature Importance

Open · Steviey opened this issue 5 years ago · 1 comment

Hi there,

Is there an established way to obtain SHAP feature importance using shapper?

Reading this https://christophm.github.io/interpretable-ml-book/shap.html#shap-feature-importance

...I would guess that looping over "shapper::individual_variable_effect" and taking the mean of the attributions per vname could do the trick.
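
Something like this, perhaps (a minimal, untested sketch; `explainer` and `testX` are placeholders, and I'm assuming the result of individual_variable_effect is a data frame with "_vname_" and "_attribution_" columns, as the error below suggests):

```r
library(shapper)

# Loop over the new observations one at a time (multiple rows currently error out,
# see below) and stack the per-observation attributions.
res <- do.call(rbind, lapply(1:5, function(i) {
  individual_variable_effect(explainer, new_observation = testX[i, , drop = FALSE])
}))

# SHAP feature importance as in the linked chapter: mean absolute attribution per variable.
imp <- aggregate(abs(res$`_attribution_`), by = list(vname = res$`_vname_`), FUN = mean)
names(imp)[2] <- "mean_abs_attribution"
imp[order(-imp$mean_abs_attribution), ]
```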

Am I wrong?

Is there any plan to integrate the original functions, like summary_plot, to obtain SHAP feature importance?

By the way, when I try to feed the function individual_variable_effect multiple new observations (new_observation = testX[1:5, ]), I get errors:

Error in $<-.data.frame(tmp, "_attribution_", value = c(0, -0.365675633989662, : replacement has 140 rows, data has 70

Steviey · Feb 28 '20 13:02

Hello, I needed exactly this functionality for a university project and implemented it here: #26. Additionally, to cope with larger data sets, I implemented the kmeans function from the SHAP Python library to help summarize data instances.
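
For anyone curious, the summarization idea is roughly the following (only a sketch using base R's kmeans, not the actual code in #26; `model` and `trainX` are placeholders, and the SHAP library's own kmeans helper additionally weights each center by its cluster size):

```r
# Summarize the background data with k cluster centers (numeric features assumed),
# so the Kernel SHAP estimation only iterates over k representative instances
# instead of the full training set.
summarize_background <- function(X, k = 10) {
  km <- stats::kmeans(X, centers = k)
  as.data.frame(km$centers)
}

background <- summarize_background(trainX, k = 10)
explainer  <- DALEX::explain(model, data = background)
```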

stereolith · Mar 03 '20 18:03