This is my model. I'm using TensorFlow 2.4.1.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=1000,
                              output_dim=64,
                              name='embedding',
                              mask_zero=True),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(4, name='logits')
])
metrics = [tf.keras.metrics.SparseCategoricalAccuracy()]
# compile the model
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=metrics)
When I run the following code, I get None as the gradient with respect to the input.
def compute_gradients(t, target_class_idx):
    with tf.GradientTape() as tape:
        tape.watch(t)
        logits = model(t)
        probs = tf.nn.softmax(logits, axis=-1)[:, target_class_idx]
    grads = tape.gradient(probs, t)
    return grads
Here is a sample input and the call:
sample_tensor = tf.random.uniform(shape=(1, 50))
path_gradients = compute_gradients(t=sample_tensor,
                                   target_class_idx=0)
print(path_gradients)
None
What am I doing wrong?
Thanks.
The Embedding layer in TensorFlow is not differentiable with respect to its input: it performs a lookup keyed by integer token indices, and there is no meaningful gradient of the output with respect to those indices, so tape.gradient(probs, t) returns None. Gradients do flow to the layer's weights and to the embedding vectors it outputs, just not to the token IDs themselves. Source: https://github.com/keras-team/keras/issues/12270
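The usual workaround (e.g. for saliency or integrated-gradients attributions) is to take the gradient with respect to the embedding layer's output rather than the raw token IDs. A minimal sketch, assuming the same architecture as in the question; compute_embedding_gradients and sample_ids are illustrative names, not part of the original code:

```python
import tensorflow as tf

# Rebuild the model from the question so this snippet is self-contained.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=1000, output_dim=64,
                              name='embedding', mask_zero=True),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(4, name='logits'),
])

embedding_layer = model.get_layer('embedding')

def compute_embedding_gradients(token_ids, target_class_idx):
    # Look up the float embedding vectors first, then watch *them*:
    # the lookup itself has no gradient w.r.t. the integer IDs.
    embedded = embedding_layer(token_ids)
    with tf.GradientTape() as tape:
        tape.watch(embedded)
        x = embedded
        # Run the remaining layers on the watched embeddings.
        for layer in model.layers[1:]:
            x = layer(x)
        probs = tf.nn.softmax(x, axis=-1)[:, target_class_idx]
    return tape.gradient(probs, embedded)

# Token IDs should be integers in [0, input_dim), not floats.
sample_ids = tf.random.uniform(shape=(1, 50), maxval=1000, dtype=tf.int32)
grads = compute_embedding_gradients(sample_ids, target_class_idx=0)
print(grads.shape)  # one gradient vector per token: (1, 50, 64)
```

The returned tensor has one 64-dimensional gradient per input token, which is what attribution methods typically aggregate (e.g. by taking a norm or a dot product with the embeddings).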