
Feature importance in the caret package

I applied four ML methods (linear, XGBoost, RF, SVM) using the caret package and extracted the feature importances with the varImp function:

caret::varImp(xgb1, scale = TRUE)

However, the feature importances do not sum to 1. Is there a different function I can use so that all the feature importances add up to 1?

Thank you,

Absent a reproducible example, we'll use the vowel data from The Elements of Statistical Learning to fit a random forest, then rescale the variable importances so they sum to 1 by dividing each importance value by the sum of all the importance values.

library(readr)
library(caret)
library(randomForest)

# Load the vowel data and drop the row-index column
vowel.train <- subset(read_csv("https://web.stanford.edu/~hastie/ElemStatLearn/datasets/vowel.train"),
                      select = -row.names)
vowel.test <- subset(read_csv("https://web.stanford.edu/~hastie/ElemStatLearn/datasets/vowel.test"),
                     select = -row.names)

# The outcome is a class label, so convert it to a factor
vowel.train$y <- as.factor(vowel.train$y)
vowel.test$y <- as.factor(vowel.test$y)

set.seed(33833)
tr1Control <- trainControl(method = "boot")
modFit <- train(y ~ ., method = "rf", trControl = tr1Control, data = vowel.train)

# Variable importance: extract the importance data frame from varImp()
# and rescale so the values sum to 1
v <- varImp(modFit, scale = TRUE)[["importance"]]
v$Overall <- v$Overall / sum(v$Overall)
v

...and the output:

> v
         Overall
x.1  0.318660495
x.2  0.327734091
x.3  0.018931795
x.4  0.021533916
x.5  0.126744531
x.6  0.089627688
x.7  0.000000000
x.8  0.067066743
x.9  0.027072197
x.10 0.002628545

...and to demonstrate that sum(v$Overall) is now 1:

sum(v$Overall)

> sum(v$Overall)
[1] 1
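
To apply the same rescaling to any of the four models, the division can be wrapped in a small helper. This is a minimal sketch, assuming the model was fit with caret::train; the function name rescale_varimp is illustrative, not part of caret:

# Rescale a caret model's variable importances so each column sums to 1.
# (Illustrative helper, not a caret function.)
rescale_varimp <- function(fit) {
  imp <- caret::varImp(fit, scale = TRUE)[["importance"]]
  imp[] <- lapply(imp, function(col) col / sum(col))  # preserves row names
  imp
}

rescale_varimp(modFit)  # same values as v above

Normalizing column by column also covers models for which varImp returns one importance column per class rather than a single Overall column.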
