Malek Baba

Results: 10 comments of Malek Baba

I thought about using a computer vision model like YOLOv3 to segment the FAQ section of the page into questions and their answers, but I'm not sure whether it...

Here is an example of a config file:

```
/inputApk/
/sourceSinks.txt
/platforms
/outAnalysis
true
NoMatch
true
SourceListOnly
300
true
NoImplicitFlows
false
false
false
false
Fast
1
true
300
300
```
...

Hi, I want to work on this feature and would like to get more info on it. By the way, I'm new to the open-source world, so please bear with...

As far as I found, there is another technique besides SHAP values, named LIME, for local interpretability. It is implemented in the https://github.com/interpretml/interpret package. LIME is said to be faster than...
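To illustrate what LIME does locally, here is a minimal, self-contained sketch of the idea (not the interpret or lime libraries themselves): perturb the data around one instance, weight the perturbations by proximity, and fit a weighted linear surrogate whose coefficients serve as the local explanation. The kernel width and perturbation scale below are arbitrary choices for the sketch.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Ridge

# Black-box model to explain
X, y = load_breast_cancer(return_X_y=True)
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

rng = np.random.default_rng(0)
x = X[0]                                  # the instance to explain
scale = X.std(axis=0)

# Perturb around x and query the black box
Z = x + rng.normal(0.0, 1.0, size=(500, X.shape[1])) * scale
pz = model.predict_proba(Z)[:, 1]

# Proximity kernel: closer perturbations get larger weight
d = np.linalg.norm((Z - x) / scale, axis=1)
w = np.exp(-(d ** 2) / 25.0)

# Weighted linear surrogate; its coefficients are the local explanation
surrogate = Ridge(alpha=1.0).fit(Z, pz, sample_weight=w)
top = np.argsort(np.abs(surrogate.coef_))[::-1][:5]
print("top local features:", top)
```

The real libraries add feature selection and discretization on top of this, but the surrogate-around-one-point mechanism is the same.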

Sure. I will start to work on it.

Hi, when I tried to get the local explanations, I noticed a bug in the predict function of the AutoML object when you provide it with a numpy.ndarray instead...

Yeah, LIME actually needs preprocessed data to do the explanations, and at the same time it needs that data as a NumPy array. On top of that, it uses the predict()...
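One common workaround for the ndarray-vs-DataFrame mismatch is a small adapter: LIME hands the prediction function a plain NumPy array, and the adapter rebuilds the DataFrame with the original column names before calling the model. This is a hypothetical sketch (the `predict_fn` wrapper and the scikit-learn model stand in for the AutoML object discussed above):

```python
import numpy as np
import pandas as pd
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# A model trained on a named-column DataFrame, as an AutoML pipeline would be
iris = load_iris(as_frame=True)
X_df, y = iris.data, iris.target
model = LogisticRegression(max_iter=1000).fit(X_df, y)

def predict_fn(arr: np.ndarray) -> np.ndarray:
    """Adapter for LIME: restore the DataFrame the model was trained on."""
    return model.predict_proba(pd.DataFrame(arr, columns=X_df.columns))

# LIME would call predict_fn with a raw ndarray of perturbations
probs = predict_fn(X_df.to_numpy()[:3])
print(probs.shape)
```

The same trick works for any predict() that insists on named columns: keep the training column order in one place and reattach it inside the wrapper.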

Sorry for my late reply due to my other responsibilities. I managed to use the LIME library on XGBoost and random forest trees; however, on the CatBoost algorithms, I couldn't...

I think that, since we have a problem with the LIME integration, I will try to provide explanations using the SHAP library instead.

Hi, I managed to get SHAP single predictions on models like CatBoost, and attached the Colab I used here. By the way, I noticed an issue regarding CatBoost SHAP prediction...
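For readers unfamiliar with what a "SHAP single prediction" computes, here is a conceptual sketch (not the shap library or CatBoost's built-in SHAP output): exact Shapley values for one prediction of a toy linear model, marginalizing absent features over a background set. The toy model and background data are made up for illustration; real explainers use model-specific shortcuts instead of this exponential enumeration.

```python
import itertools
import math
import numpy as np

rng = np.random.default_rng(0)
X_bg = rng.normal(size=(64, 3))          # background data for marginalization
w = np.array([2.0, -1.0, 0.5])

def f(X):
    """Toy linear model standing in for the black box."""
    return X @ w

x = np.array([1.0, 2.0, 3.0])            # the prediction to explain
n = len(x)

def value(S):
    """Expected prediction with features in S fixed to x, others from background."""
    Z = X_bg.copy()
    Z[:, list(S)] = x[list(S)]
    return f(Z).mean()

# Exact Shapley formula: average marginal contribution over all coalitions
phi = np.zeros(n)
for i in range(n):
    rest = [j for j in range(n) if j != i]
    for r in range(n):
        for S in itertools.combinations(rest, r):
            weight = math.factorial(r) * math.factorial(n - r - 1) / math.factorial(n)
            phi[i] += weight * (value(S + (i,)) - value(S))

# Efficiency property: contributions sum to f(x) minus the average prediction
print(np.allclose(phi.sum(), f(x[None, :])[0] - f(X_bg).mean()))  # True
```

The efficiency check at the end is the defining property the shap library also guarantees, which makes it a useful sanity test when a model's SHAP output looks suspicious.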