That is to say, suppose I have a differentiable model g and a differentiable function f (which could itself include models):
r = r0  # some initial input
with tf.GradientTape(persistent=True) as tape:
    for _ in range(n):
        r = g(r)
    loss = f(r)
grad = tape.gradient(loss, g.trainable_variables)
Would tape.gradient apply backpropagation through time for n steps on g?
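As a sanity check, one can iterate a tiny linear g and compare tape.gradient against the closed-form derivative. The Dense layer, constant initializer, and concrete values of n and r below are illustrative assumptions, not part of the question:

```python
import tensorflow as tf

n = 3
# A 1x1 linear "model" g(r) = w * r with w = 2, so n applications give w**n.
g = tf.keras.layers.Dense(1, use_bias=False,
                          kernel_initializer=tf.keras.initializers.Constant(2.0))
f = tf.reduce_sum  # any differentiable function works here

r = tf.constant([[1.0]])
with tf.GradientTape(persistent=True) as tape:
    for _ in range(n):
        r = g(r)  # the tape records every application of g
    loss = f(r)

grad = tape.gradient(loss, g.trainable_variables)
# loss = w**n, so d(loss)/dw = n * w**(n-1) = 3 * 4 = 12
print(grad[0].numpy())
del tape  # release the persistent tape's resources
```

The gradient matches the analytic value for the unrolled loop, i.e. the tape does differentiate through all n applications of g, which is exactly backpropagation through time.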
You can use matplotlib to plot the two curves and compare them: a red line for the target and a blue line for the prediction.
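A minimal sketch of that comparison plot; the sine-wave target and noisy prediction are placeholder data, and the output filename is an arbitrary choice:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this line for interactive use
import matplotlib.pyplot as plt

# Placeholder series standing in for the real target and model output.
t = np.linspace(0, 2 * np.pi, 200)
target = np.sin(t)
prediction = np.sin(t) + 0.1 * np.random.randn(200)

plt.plot(t, target, color="red", label="target")
plt.plot(t, prediction, color="blue", label="prediction")
plt.legend()
plt.savefig("comparison.png")
```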
tf.GradientTape(persistent=False, watch_accessed_variables=True)
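Note that persistent defaults to False, meaning the tape is released after a single gradient call; passing persistent=True allows multiple calls on the same tape. A small sketch (the variable and functions are illustrative):

```python
import tensorflow as tf

x = tf.Variable(3.0)
with tf.GradientTape(persistent=True) as tape:
    y = x * x   # x is a tf.Variable, so it is watched automatically
    z = y * y

# With persistent=True, gradient() may be called more than once.
dy_dx = tape.gradient(y, x)  # 2 * x = 6
dz_dx = tape.gradient(z, x)  # 4 * x**3 = 108
print(dy_dx.numpy(), dz_dx.numpy())
del tape
```

With the default persistent=False, the second tape.gradient call would raise a RuntimeError.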