I am training several ML models using the mlr3 package and have been using iml to retrieve permutation importance for the variables in my data. However, I have noticed that, for BART models, the variable importance is exactly the same for every variable. Below is code to reproduce this issue using a toy dataset. Even a variable that is completely unrelated to the outcome has the same importance as all the others.
One update: this only seems to be an issue when running the BART model in parallel. I do not get identical importances for all variables when using other models (e.g., Ranger, xgboost, GBM) in parallel, nor when using BART with sequential variable importance calculation. I have modified the code below so that it reproduces this issue:
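The original reproduction code did not survive extraction. The following is a minimal sketch of the setup described above, under the assumption that the BART learner is `lrn("regr.bart")` from mlr3extralearners (backed by dbarts), that parallelism is enabled via the future package, and that the toy data and variable names (`x1`, `x2`, `x3`, `y`) are placeholders, not the originals:

```r
library(mlr3)
library(mlr3extralearners)  # assumed source of lrn("regr.bart") (dbarts backend)
library(iml)
library(future)

set.seed(1)
# Toy data: x3 is deliberately unrelated to the outcome y
n <- 500
dat <- data.frame(x1 = rnorm(n), x2 = rnorm(n), x3 = rnorm(n))
dat$y <- 2 * dat$x1 - dat$x2 + rnorm(n)

task <- as_task_regr(dat, target = "y")
learner <- lrn("regr.bart")
learner$train(task)

predictor <- Predictor$new(learner,
                           data = dat[, c("x1", "x2", "x3")],
                           y    = dat$y)

# Parallel: every feature reportedly gets the same importance with BART
plan(multisession)
imp_par <- FeatureImp$new(predictor, loss = "mse")
print(imp_par$results)

# Sequential: importances differ as expected (x3 near 1, x1/x2 higher)
plan(sequential)
imp_seq <- FeatureImp$new(predictor, loss = "mse")
print(imp_seq$results)
```

This is only an illustration of the reported setup, not the issue author's actual script; package availability and learner keys should be checked against the versions in use.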
Output: