
How can I get input gradient in Keras?

I'm building a custom neural network implementation, and I use Keras as a reference to make sure the gradients computed by my implementation match those computed by Keras. Thanks to this answer, How to obtain the gradients in keras?, I was able to compare gradients for weights and outputs. However, I would also like to compare gradients for the INPUTS. My Keras model is just one dense layer:

model = Sequential()
model.add(Dense(output_size,
                use_bias=bias,
                input_shape=(input_size,),
                activation=activation_name))
model.compile(optimizer="sgd", loss=loss_function_name)

...

model.evaluate(x, y)

How can I get the gradients with respect to x?

It seems like you want the gradient of the loss function with respect to the input (not the weights, as is usually the case). You can use tf.GradientTape() for this. Here is a sample implementation adapted from a TensorFlow tutorial, with minimal edits to suit your situation:

import tensorflow as tf

loss_object = tf.keras.losses.CategoricalCrossentropy()  # Can be any loss function

model = tf.keras.applications.MobileNetV2(include_top=True, weights='imagenet')  # Can be any model

def compute_gradient(inputs, input_label):
  with tf.GradientTape() as tape:
    # Inputs are not trainable variables, so the tape must be told to watch them
    tape.watch(inputs)
    prediction = model(inputs)
    loss = loss_object(input_label, prediction)
  # Differentiate the loss with respect to the inputs rather than the weights
  gradient = tape.gradient(loss, inputs)
  return gradient

For more information on how to use tf.GradientTape(), refer to the official documentation.
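Applied to your single-dense-layer setup, the same pattern can be checked against the analytic gradient. Below is a minimal sketch (the sizes, the linear activation, and the MSE loss are my assumptions for illustration, not taken from your code): for a linear layer p = xW + b with mean-squared-error loss, d(loss)/dx = (2/output_size) · (p − y) · Wᵀ, so the tape's result can be verified directly.

```python
import numpy as np
import tensorflow as tf

# Hypothetical sizes chosen for illustration
input_size, output_size = 3, 2

model = tf.keras.Sequential([
    tf.keras.layers.Dense(output_size, use_bias=True,
                          input_shape=(input_size,), activation="linear")
])
loss_object = tf.keras.losses.MeanSquaredError()

x = tf.constant(np.random.rand(1, input_size), dtype=tf.float32)
y = tf.constant(np.random.rand(1, output_size), dtype=tf.float32)

with tf.GradientTape() as tape:
    tape.watch(x)  # x is a constant tensor, so it must be watched explicitly
    loss = loss_object(y, model(x))
grad = tape.gradient(loss, x)  # shape (1, input_size)

# Analytic check: for MSE over a linear layer, d loss / d x = (2/m)(xW + b - y) W^T
W, b = model.layers[0].get_weights()
manual = 2.0 / output_size * (x.numpy() @ W + b - y.numpy()) @ W.T
assert np.allclose(grad.numpy(), manual, atol=1e-5)
```

The assertion confirms that the tape-computed input gradient matches the hand-derived one, which is exactly the comparison you want to make against your own implementation.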
