
List of Differentiable Ops in Tensorflow

Is there a master list of TensorFlow ops that are differentiable (i.e., that will auto-differentiate)?

Two other ways to phrase this:

  • List of ops that do not have ops.NoGradient set.
  • List of ops that will not trigger a LookupError.

For example, I'd assume that none of the control flow ops are differentiable (e.g., tf.where). How would I find this out other than by manually running every op through tf.gradients and seeing whether it throws a LookupError?
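The brute-force probe described above can be sketched as follows. This is only a sketch, not a definitive tool: it builds each op inside a fresh graph (so tf.gradients is usable even under TF2's eager mode) and treats both a LookupError and a None gradient as "not differentiable". The helper name is_differentiable is my own, not a TensorFlow API:

```python
import tensorflow as tf

def is_differentiable(op_fn):
    """Probe whether TensorFlow can build a gradient for op_fn(x).

    Returns False when the op either raises LookupError (no gradient
    registered at all) or yields a None gradient (the op is marked
    ops.NotDifferentiable).
    """
    graph = tf.Graph()
    with graph.as_default():  # graph mode, so tf.gradients is usable
        x = tf.compat.v1.placeholder(tf.float32, shape=[3])
        try:
            y = op_fn(x)
            (grad,) = tf.gradients(y, x)
            return grad is not None
        except LookupError:
            return False

print(is_differentiable(tf.identity))  # Identity has a gradient
print(is_differentiable(tf.round))     # Round is marked not differentiable
```

Note that a None result from tf.gradients and a LookupError are distinct cases: the former means the op was explicitly registered as non-differentiable, the latter that no gradient was registered at all.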

"Commonsense" is not a valid answer.

Thanks.

EDIT:

tf.where is differentiable, so my intuitions were wrong. Perhaps the correct question here is: which ops in TensorFlow are not differentiable?

Thanks.

I have generated the full list of differentiable and non-differentiable ops using Python code.

You will find the compact list, along with the code that generated it, here:

https://github.com/Mainak431/List-of-Differentiable--OPs-and-Non-differentiable-OPs--in-Tensorflow

No, there is no such list (you could be the first to create it). Also, as far as I am aware, the documentation of each function does not mention it either (for example, tf.size is non-differentiable, but its documentation does not say so).

Apart from the way you suggested, you can also extract this data from the source code. All the ops that have a gradient implemented have @ops.RegisterGradient in front of the gradient function's declaration. For ops that do not have a gradient, you will find an ops.NotDifferentiable(...) call instead.
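Those two decorators both populate TensorFlow's internal gradient registry, so instead of grepping the source you can inspect the registry directly. A sketch, with the caveat that _gradient_registry is a private attribute whose layout may change between versions (here each entry is assumed to be a dict whose "type" key holds the gradient function, or None for ops registered via ops.NotDifferentiable):

```python
import tensorflow as tf  # importing tf loads the standard gradient definitions
from tensorflow.python.framework import ops

# Private registry mapping op name -> {"type": grad_fn_or_None, ...}
registry = ops._gradient_registry._registry

differentiable = sorted(
    name for name, entry in registry.items() if entry["type"] is not None
)
non_differentiable = sorted(
    name for name, entry in registry.items() if entry["type"] is None
)

print(len(differentiable), "ops with a registered gradient")
print(len(non_differentiable), "ops marked NotDifferentiable")
```

Ops that appear in neither list have no registration at all, and those are the ones that raise LookupError from tf.gradients.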

Not directly related, but probably helpful.

For TensorFlow 2, it seems such a list is provided in the tf.raw_ops module documentation.

