

Keras gradient wrt input for multiple output dimensions

I have a Keras model with a two-dimensional output (binary classification).

model.output # <tf.Tensor 'dense_1_3/MatMul:0' shape=(?, 2) dtype=float32>

and

model.input # <tf.Tensor 'bidirectional_1_input:0' shape=(?, ?, 200) dtype=float32>

I evaluated three different gradients for some example input of shape (1, 50, 200):

gradients0 = K.gradients(model.output[:,0], model.inputs)
gradients1 = K.gradients(model.output[:,1], model.inputs)
gradients2 = K.gradients(model.output, model.inputs)

I thought the first two expressions would yield the gradients for the individual output neurons, and the last one would yield a tensor containing the first two. To my surprise, all three gradients have a shape of (1, 50, 200). In my opinion, gradients2 should have shape (2, 50, 200), since model.output is two-dimensional. What is Keras computing in this case?

Keras.backend.gradients() expects the output to be a scalar function, not a multi-dimensional one. I've found with a small example that K.gradients() behaves identically to tf.gradients(). As documented at https://www.tensorflow.org/api_docs/python/tf/gradients, your gradients2 returns a list of tensors of length len(xs), where each tensor is sum(dy/dx) for y in ys. That is, the per-output gradients are summed into a single tensor with the same shape as the input, which explains why the first shape dimension is 1 and not 2.
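You can check this summing behavior by hand on a toy linear model. The sketch below uses plain NumPy (not the Keras model from the question, which is an assumption for illustration): for y = x @ W with two output neurons, the gradient of each output wrt x is the corresponding column of W, and the "vector-output gradient" that tf.gradients computes is their elementwise sum, with the same shape as a single-output gradient.

```python
import numpy as np

# Hypothetical tiny linear model y = x @ W with 2 output neurons
# (stand-in for the Keras model in the question).
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))   # 3 input features, 2 outputs
x = rng.standard_normal((1, 3))

# Per-output gradients wrt x: dy[:, k]/dx is simply column k of W.
grad0 = W[:, 0]                   # analogue of gradients0, shape (3,)
grad1 = W[:, 1]                   # analogue of gradients1, shape (3,)

# What tf.gradients / K.gradients returns for the full vector output:
# the gradient of sum(y) wrt x, i.e. the per-output gradients added up.
grad_sum = W.sum(axis=1)          # analogue of gradients2

assert np.allclose(grad_sum, grad0 + grad1)
print(grad_sum.shape)             # (3,) -- same shape as one output's gradient
```

So all three results share the input's shape; the information about which output each contribution came from is lost in the sum. To get per-output gradients you have to compute them separately, as in gradients0 and gradients1 (or compute a full Jacobian).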

This link can help you: Tensorflow gradient with respect to matrix
