
SHAP ValueError: Dimension 1 in both shapes must be equal, but are 2 and 1. Shapes are [?,2] and [?,1]

Starting from a previously trained feed-forward network, I am trying to use SHAP to obtain feature importances. I followed all the steps described in the documentation, but I still receive the following error:

ValueError: Dimension 1 in both shapes must be equal, but are 2 and 1. Shapes are [?,2] and [?,1]

The following code produces a reproducible example that raises the same error.

import pandas as pd
from numpy.random import randint
from keras.utils import to_categorical
from keras.models import Sequential
from keras.layers import Dense, BatchNormalization, Dropout, Activation
from keras.optimizers import Adam
import shap

# Train_x data creation
train_x = pd.DataFrame({
    'v1': randint(2, 20, 1489),
    'v2': randint(50, 200, 1489),
    'v3': randint(30, 90, 1489),
    'v4': randint(100, 150, 1489)
})

# Train_y data creation
train_y = randint(0, 2, 1489)

# One-hot encoding as I use categorical cross-entropy
train_y = to_categorical(train_y, num_classes=2)

# Start construction of a DNN Sequential model.
model = Sequential()

# First input layer with a dropout and batch normalization layer following
model.add(Dense(256, input_dim=train_x.shape[1]))
model.add(BatchNormalization())
model.add(Activation('relu'))
model.add(Dropout(rate=0.2))

# Output layer
model.add(Dense(2))
model.add(Activation('softmax'))

# Use the Adam optimizer
optimizer = Adam(lr=0.001)

# Compile the model
model.compile(loss='categorical_crossentropy', optimizer=optimizer, metrics=['accuracy'])

model.summary()

# Fit model
hist = model.fit(train_x, train_y, epochs=100, batch_size=128, shuffle=False, verbose=2)

# SHAP calculation
explainer = shap.DeepExplainer(model, train_x)
shap_values = explainer.shap_values(train_x[:500].values)

My input shape is (None, 4) and the final layer has 2 neurons with a softmax activation function, since I use the model for binary classification. The train_x data in the snippet above is a pandas DataFrame of shape (1489, 4).

I have tried changing the shape of train_x, but I ran into similar errors. Any help would be greatly appreciated.
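As a sanity check (a minimal sketch using only the model defined above, not part of the original error trace), the shapes Keras reports line up with the [?,2] side of the error:

# Sanity check: the two-unit softmax head is the "2" in the error's [?,2]
print(model.input_shape)   # (None, 4)
print(model.output_shape)  # (None, 2)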

Please see a working example of binary classification in TF below:

import pandas as pd
from numpy.random import randint
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, BatchNormalization, Dropout, Activation
from tensorflow.keras.optimizers import Adam
import shap
import tensorflow
print(shap.__version__, "\n", tensorflow.__version__)

# Train_x data creation
train_x = pd.DataFrame({
    'v1': randint(2, 20, 1489),
    'v2': randint(50, 200, 1489),
    'v3': randint(30, 90, 1489),
    'v4': randint(100, 150, 1489)
})

# Train_y data creation
train_y = randint(0, 2, 1489)

# One-hot encoding as I use categorical cross-entropy
train_y = to_categorical(train_y, num_classes=2)

# Start construction of a DNN Sequential model.
model = Sequential()

# First input layer with a dropout and batch normalization layer following
model.add(Dense(256, input_dim=train_x.shape[1]))
model.add(BatchNormalization())
model.add(Activation('relu'))
model.add(Dropout(rate=0.2))

# Output layer
model.add(Dense(2))
model.add(Activation('softmax'))

# Use the Adam optimizer
optimizer = Adam(lr=0.001)

# Compile the model
model.compile(loss='categorical_crossentropy', optimizer=optimizer, metrics=['accuracy'])

# model.summary()

# Fit model
hist = model.fit(train_x, train_y, epochs=100, batch_size=128, shuffle=False, verbose=0)

# SHAP calculation
# Register a passthrough handler for TF2's "AddV2" op, which this
# version of SHAP's DeepExplainer does not handle natively
shap.explainers._deep.deep_tf.op_handlers["AddV2"] = shap.explainers._deep.deep_tf.passthrough
explainer = shap.DeepExplainer(model, train_x)

shap_values = explainer.shap_values(train_x[:500].values)

shap.summary_plot(shap_values[1])

0.38.2
2.2.0

[SHAP summary plot image]

A couple of things to note:

  1. Package versions (I believe TF should be below 2.4)
  2. Adding "AddV2" (see the discussion here); a minimal sketch of an alternative that avoids the op handlers entirely follows below
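
If pinning the package versions is not an option, a possible fallback (a sketch only, not from the original answer) is shap.GradientExplainer, which estimates expected gradients and does not depend on DeepExplainer's per-op handlers, so no "AddV2" registration is needed:

# Hypothetical alternative: GradientExplainer sidesteps DeepExplainer's
# op handlers entirely (no "AddV2" passthrough required)
import shap

explainer = shap.GradientExplainer(model, train_x.values)
shap_values = explainer.shap_values(train_x[:500].values)

# Same summary plot, here with the feature values/names attached
shap.summary_plot(shap_values[1], train_x[:500])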

