
Error while computing second derivatives in TensorFlow

I am training a model that requires computing second derivatives (i.e., gradients of gradients). Here is a short snippet that does that:

mapping_loss = tf.losses.sparse_softmax_cross_entropy(
    1 - adversary_label, adversary_logits)
adversary_loss = tf.losses.sparse_softmax_cross_entropy(
    adversary_label, adversary_logits)

''' # does not work with tf.nn.softmax_cross_entropy_with_logits either:

mapping_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.one_hot(1 - adversary_label, 2), logits=adversary_logits, name='loss1'))
adversary_loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.one_hot(adversary_label, 2), logits=adversary_logits, name='loss2'))
'''

# First-order gradients of each loss w.r.t. its own variable set
grads_target = tf.gradients(mapping_loss, list(target_vars.values()))
grads_adv = tf.gradients(adversary_loss, list(adversary_vars.values()))

grads_all = grads_target + grads_adv  # list concatenation

# Gradient-norm regularizer; differentiating it needs second derivatives
reg = 0.5 * sum(tf.reduce_sum(tf.square(g)) for g in grads_all)
Jgrads_target = tf.gradients(reg, list(target_vars.values()))
Jgrads_adv = tf.gradients(reg, list(adversary_vars.values()))

I am getting the following error:

Traceback (most recent call last):
  File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py", line 455, in gradients
    grad_fn = ops.get_gradient_function(op)
  File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1682, in get_gradient_function
    return _gradient_registry.lookup(op_type)
  File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/framework/registry.py", line 93, in lookup
    "%s registry has no entry for: %s" % (self._name, name))
LookupError: gradient registry has no entry for: PreventGradient

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "tools/train_adda.py", line 215, in <module>
    main()
  File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "tools/train_adda.py", line 137, in main
    Jgrads_target = tf.gradients(reg, list(target_vars.values()))
  File "/scratch0/Projects/summer/adda/env/lib/python3.6/site-packages/tensorflow/python/ops/gradients_impl.py", line 459, in gradients
    (op.name, op.type))
LookupError: No gradient defined for operation 'gradients_1/sparse_softmax_cross_entropy_loss_1/xentropy/xentropy_grad/PreventGradient' (op type: PreventGradient)

It seems TensorFlow does not support second derivatives of softmax cross entropy at the moment: the gradient of the fused cross-entropy op is deliberately wrapped in a `PreventGradient` op, which is exactly the op named in your traceback. See https://github.com/tensorflow/tensorflow/blob/c2ce4f68c744e6d328746b144ff1fcf98ac99e6c/tensorflow/python/ops/nn_grad.py#L449
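A common workaround (not part of the original answer, so treat the exact expression as an assumption) is to rebuild the loss from `tf.nn.log_softmax` and `tf.one_hot` instead of the fused op: every op in that expression has a registered gradient, so `tf.gradients` can be applied to it twice. The identity this relies on — cross-entropy against a one-hot label equals the negative log-softmax at the label index — can be checked in plain NumPy:

```python
import numpy as np

def log_softmax(logits):
    # Numerically stable log-softmax along the last axis.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))

def xent_from_log_softmax(logits, labels, num_classes=2):
    # Cross-entropy rebuilt from log-softmax: -sum(one_hot * log_softmax).
    # In TF1 graph code this corresponds to something like (a sketch, not
    # code from the question):
    #   tf.reduce_mean(-tf.reduce_sum(
    #       tf.one_hot(labels, 2) * tf.nn.log_softmax(logits), axis=-1))
    one_hot = np.eye(num_classes)[labels]
    return -(one_hot * log_softmax(logits)).sum(axis=-1)

logits = np.array([[2.0, -1.0], [0.5, 0.3]])
labels = np.array([0, 1])

# Direct definition of the loss: -log p[label]
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
direct = -np.log(probs[np.arange(len(labels)), labels])

assert np.allclose(xent_from_log_softmax(logits, labels), direct)
```

The rebuilt loss is slightly slower than the fused op, but because its graph contains no `PreventGradient` node, the second `tf.gradients` call on `reg` can backpropagate through it.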
