

Decision Tree and Feature Importance: Why does the decision tree not show the importance of all variables?

I have run a decision tree with 62 independent variables to predict stock prices. However, when extracting the feature importance with classifier_DT_tuned$variable.importance, I only see the importance of 55 of the 62 variables.

I would have expected the decision tree to pick up the most important variables and assign an importance of 0.00 to the unused ones. Could you please help me out and elaborate on this issue? Thanks!

Did you try getting the feature importance like below:

feat_importance = list(dt_clf.tree_.compute_feature_importances())

This will give you the list of importances for all 62 features/variables.
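As a runnable sketch of this approach: the snippet below assumes a fitted scikit-learn DecisionTreeClassifier named dt_clf (note that the question itself uses R's rpart, while this answer's code is scikit-learn). The toy data, feature count, and tree depth here are illustrative, not from the original question.

```python
# Sketch only: toy data standing in for the questioner's 62 stock-price variables.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))           # 10 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # only 2 features are informative

dt_clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# tree_.compute_feature_importances() returns one value per input feature,
# so variables the tree never split on show up explicitly with importance 0.0
# (unlike rpart's variable.importance, which simply omits them).
feat_importance = list(dt_clf.tree_.compute_feature_importances())
print(len(feat_importance))  # one entry per feature, zeros included
print(feat_importance)
```

With max_depth=2 the tree makes at most three splits, so most of the ten entries come back as 0.0 rather than being dropped from the list.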

