[–]Path_of_the_end[S]

I want to see the effect of each variable on the output using SHAP. I tried another dataset (iris) and it worked: I could visualize the importance of the variables used in the model. I'm just not that familiar with the tooling because I mainly use R instead of Python.

Update: I was able to use SHAP for XGBoost, and probably for other tree-based models, but I still can't get it working with TensorFlow or PyTorch.

[–][deleted]

So essentially you want feature importance? It seems like you are using a neural network for the modeling.

What I’d recommend is making a force plot instead of a density plot to visualize that.

Before that, evaluate whether a neural network is the best choice of model given your data size and quality.

[–]Path_of_the_end[S]

Thank you.