
Decision Tree Feature Importance #200

Open
cmougan opened this issue Jul 25, 2020 · 0 comments


cmougan commented Jul 25, 2020

I believe that in section 4.4.1 (https://christophm.github.io/interpretable-ml-book/tree.html) you define only one type of feature importance. There are several other types, and it might be interesting for readers to know about them.
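For context, here is a minimal sketch (my own illustration, not code from the book) of that single, impurity-based notion of importance as scikit-learn exposes it, which I believe is close to what the chapter describes; the dataset and hyperparameters are arbitrary placeholders:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier

data = load_breast_cancer()
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(data.data, data.target)

# feature_importances_ is the (normalized) total impurity reduction
# contributed by each feature across all splits of this single tree.
for name, importance in zip(data.feature_names, tree.feature_importances_):
    if importance > 0:
        print(f"{name}: {importance:.3f}")
```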

In the xgboost API, for example, the documentation for feature importance reads: "Get feature importance of each feature. Importance type can be defined as:"

- `weight`: the number of times a feature is used to split the data across all trees.
- `gain`: the average gain across all splits the feature is used in.
- `cover`: the average coverage across all splits the feature is used in.
- `total_gain`: the total gain across all splits the feature is used in.
- `total_cover`: the total coverage across all splits the feature is used in.

Covering these could be a more thorough extension of the single definition in the chapter; a minimal example is sketched below.
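To illustrate, here is a minimal sketch (mine, not taken from the xgboost docs) of how these types can be queried through `Booster.get_score`; the toy data and training parameters are arbitrary:

```python
import numpy as np
import xgboost as xgb

# Toy binary-classification data; any tabular dataset would do.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

booster = xgb.train(
    {"objective": "binary:logistic", "max_depth": 3},
    xgb.DMatrix(X, label=y),
    num_boost_round=20,
)

# Print the per-feature scores under each importance definition.
for imp_type in ("weight", "gain", "cover", "total_gain", "total_cover"):
    print(imp_type, booster.get_score(importance_type=imp_type))
```

The same fitted model can rank features differently depending on the importance type, which is why mentioning more than one definition seems worthwhile.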

P.S. The book is really interesting, thanks :)
