Ilya Matiach
perf: reduce LightGBM prediction time by 20%. Depends on native changes: https://github.com/microsoft/LightGBM/pull/3159. In testing, prediction time dropped from 82159416923ns to 65852779813ns on the Pima dataset (test added in PR but num...
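As a quick sanity check, the two timings quoted above do work out to roughly the claimed 20% reduction:

```python
# Timings copied from the PR description (nanoseconds on the Pima dataset).
before_ns = 82_159_416_923
after_ns = 65_852_779_813

# Fractional reduction in prediction time.
reduction = 1 - after_ns / before_ns   # ~0.198, i.e. about 20%
```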
TreeExplainer on LightGBMClassifier returns a 2D array of SHAP values in the binary classification case
For the binary classification case, when using TreeExplainer with scikit-learn, the SHAP values are in a 3D array where the 1st dimension is the class, the 2nd dimension is the rows, and...
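A small NumPy sketch of the shape mismatch described above. The arrays are made up purely to illustrate the two layouts (per-class 3D vs. flat 2D); they are not real SHAP output, but in binary classification the two classes' SHAP values are negations of each other, which is why a single 2D slice can carry the same information:

```python
import numpy as np

n_samples, n_features = 4, 3

# Hypothetical per-class layout, shape (n_classes, n_samples, n_features):
# class 0 values are the negation of class 1 values in the binary case.
class1 = np.full((n_samples, n_features), 0.5)
per_class = np.stack([-class1, class1])          # shape (2, 4, 3)

# The flat 2D layout keeps only the positive class, shape (n_samples, n_features).
flat = per_class[1]
```

Consumers expecting one layout can normalize to the other by stacking `[-values, values]` or by slicing out the class-1 plane.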
## Description Add UI tests for the new RAI text dashboard, specifically the individual feature importances component for text. This includes testing: 1.) the TextHighlighting component, which displays the text...
**Is your feature request related to a problem? Please describe.** Add UI tests for...
In the RAI dashboard, when selecting two datapoints in the feature importance plot, the "sort by absolute" toggle is on in the UI, but the chart doesn't actually sort by absolute...
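For reference, the intended "sort by absolute" behavior can be sketched in a few lines. The feature names and importance values below are hypothetical; the point is that a large negative importance should sort first under absolute ordering even though it sorts last under signed ordering:

```python
# Hypothetical signed importances for three features.
importances = {"age": -0.8, "bmi": 0.3, "glucose": 0.6}

# Signed ordering: largest signed value first.
by_signed = sorted(importances, key=importances.get, reverse=True)

# Absolute ordering: largest magnitude first, which is what the
# "sort by absolute" toggle should produce.
by_absolute = sorted(importances, key=lambda f: abs(importances[f]), reverse=True)
```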
From user issue: https://github.com/interpretml/interpret-community/issues/465, which belongs in this repo instead: > I think sometimes it'd be really useful to allow to show the Dependence plot using the log scale for...
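A minimal matplotlib sketch of what a log-scale dependence plot could look like. The feature values and SHAP values here are synthetic (generated for illustration, not from any real explainer); the relevant piece is `ax.set_xscale("log")`, which spreads out feature values that span several orders of magnitude:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for non-interactive environments
import matplotlib.pyplot as plt
import numpy as np

# Synthetic feature values from 1 to 10,000 and made-up "SHAP values"
# that trend with the feature's order of magnitude.
rng = np.random.default_rng(0)
x = 10 ** rng.uniform(0, 4, size=200)
shap_vals = np.log10(x) + rng.normal(0, 0.2, size=200)

fig, ax = plt.subplots()
ax.scatter(x, shap_vals, s=8)
ax.set_xscale("log")  # the log-scale option the user is asking for
ax.set_xlabel("feature value (log scale)")
ax.set_ylabel("SHAP value")
```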
One user commented that they would like to see the initial paths in the tree view tool: when the tool first loads, it would be useful to see the tree...
Labeling the values in the features panel as importances is a bit confusing. These values are computed via mutual information and behave more like a correlation with the error. Perhaps...
The `feature_dependence` parameter to the `LinearExplainer` was deprecated in shap several years ago (replaced by `feature_perturbation`), and it has been causing confusing warnings in our notebooks. This PR removes the parameter...