I would like to know how to get tf.gradients from a model built using the Keras API.
import numpy as np
import tensorflow as tf
from tensorflow import keras
from sklearn.datasets import make_blobs
# Create the model
inputs = keras.Input(shape=(2,))
x = keras.layers.Dense(12, activation='relu')(inputs)
x = keras.layers.Dense(8, activation='relu')(x)
predictions = keras.layers.Dense(3, activation='softmax')(x)
model = keras.models.Model(inputs=inputs, outputs=predictions)
model.compile(optimizer=tf.train.AdamOptimizer(0.001),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
# Generate random data
X, y = make_blobs(n_samples=1000, centers=3, n_features=2)
labels = keras.utils.to_categorical(y, num_classes=3)
# Compute the gradients wrt inputs
y_true = tf.convert_to_tensor(labels)
y_pred = tf.convert_to_tensor(np.round(model.predict(X)))
sess = tf.Session()
sess.run(tf.global_variables_initializer())
grads = tf.gradients(model.loss_functions[0](y_true, y_pred),
                     model.inputs[0])
sess.run(grads, feed_dict={model.inputs[0]: X, model.outputs: y})
In my first attempt above, my grads are None. With my second try below:
sess.run(grads, feed_dict={model.inputs: X, model.outputs: y})
I get the following error:
TypeError: unhashable type: 'list'
I think you shouldn't create a new session directly with TensorFlow when using Keras. Instead, it is better to use the session implicitly created by Keras:
import keras.backend as K
sess = K.get_session()
However, I don't think you need to retrieve the session at all in this case. You can easily use backend functions, such as K.gradients() and K.function(), to achieve your aim. Note also that your gradients come back as None because y_pred is built from np.round(model.predict(X)): that is a constant array with no graph connection to model.inputs, so there is nothing to differentiate.
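As a minimal sketch of that suggestion, you could compute the loss gradient with respect to the inputs via K.gradients() and K.function() like below. The disable_eager_execution() call is my assumption for running this TF 1.x-style graph code on TF 2.x; on TF 1.x you can drop it.

```python
import numpy as np
import tensorflow as tf

# K.gradients()/K.function() are graph-mode APIs; on TF 2.x this switch
# restores the TF 1.x graph behaviour the question's code assumes.
tf.compat.v1.disable_eager_execution()

from tensorflow import keras
import tensorflow.keras.backend as K

# Same toy model as in the question.
inputs = keras.Input(shape=(2,))
x = keras.layers.Dense(12, activation='relu')(inputs)
x = keras.layers.Dense(8, activation='relu')(x)
predictions = keras.layers.Dense(3, activation='softmax')(x)
model = keras.models.Model(inputs=inputs, outputs=predictions)

# Build the loss symbolically from the model's live output tensor, so the
# graph stays connected to model.inputs (feeding a rounded numpy array as
# y_pred is what made tf.gradients return None in the question).
y_true = K.placeholder(shape=(None, 3))
loss = K.categorical_crossentropy(y_true, model.output)

# K.gradients returns a list of gradient tensors; K.function wraps the
# computation so no explicit tf.Session handling is needed.
grads = K.gradients(loss, model.inputs)
get_grads = K.function([model.inputs[0], y_true], grads)

X = np.random.rand(10, 2).astype('float32')
labels = keras.utils.to_categorical(np.random.randint(0, 3, size=10),
                                    num_classes=3)
g = get_grads([X, labels])[0]
print(g.shape)  # (10, 2): one gradient row per input sample
```

K.function handles the session and feed dict internally, which avoids both the None gradients and the "unhashable type: 'list'" error from feeding lists as dictionary keys.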