
pytorch derivative returns None on .grad

import math
import torch as tr

i1 = tr.tensor(0.0, requires_grad=True)
i2 = tr.tensor(0.0, requires_grad=True)
x = tr.tensor(2*(math.cos(i1)*math.cos(i2) - math.sin(i1)*math.sin(i2)) + 3*math.cos(i1), requires_grad=True)
y = tr.tensor(2*(math.sin(i1)*math.cos(i2) + math.cos(i1)*math.sin(i2)) + 3*math.sin(i1), requires_grad=True)

z = (x - (-2))**2 + (y - 3)**2
z.backward()
dz_t1 = i1.grad
dz_t2 = i2.grad
print(dz_t1)
print(dz_t2)

I'm trying to run the code above, but I'm facing an issue after z.backward(): i1.grad and i2.grad return None. From what I understand, the cause is the way backward() is evaluated in torch, and something along the lines of i1.retain_grad() has to be used to avoid it. I tried that, placing i1.retain_grad() and i2.retain_grad() both before and after z.backward(), but I still get None as an answer. What exactly is happening, and how do I fix it? x.grad and y.grad work fine.
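For reference, a quick diagnostic (a minimal sketch, assuming the exact snippet above) shows where the graph breaks: math.cos(i1) returns a plain Python float, so wrapping the result in tr.tensor(...) creates a brand-new leaf tensor with no recorded history.

print(x.grad_fn)  # None -- x came from tr.tensor(...), not from a tracked op
print(x.is_leaf)  # True -- backward() stops at x itself
print(i1.grad)    # None -- i1 never appears in z's computational graph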

Use:

import torch

i1 = torch.tensor(0.0, requires_grad=True)
i2 = torch.tensor(0.0, requires_grad=True)
# Build x and y from torch ops so they stay attached to i1 and i2 in the graph
x = 2*(torch.cos(i1)*torch.cos(i2) - torch.sin(i1)*torch.sin(i2)) + 3*torch.cos(i1)
y = 2*(torch.sin(i1)*torch.cos(i2) + torch.cos(i1)*torch.sin(i2)) + 3*torch.sin(i1)
z = (x - (-2))**2 + (y - 3)**2
z.backward()
dz_t1 = i1.grad
dz_t2 = i2.grad
print(dz_t1)  # tensor(-30.)
print(dz_t2)  # tensor(-12.)

Here, using torch.sin and torch.cos ensures that the outputs are torch tensors connected to i1 and i2 in the computational graph. Creating x and y with torch.tensor, as in the original code, instead detaches them from the existing graph, which prevents gradients from flowing back through to i1 and i2.
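As a sanity check (this arithmetic is an addition, not part of the original answer), the angle-sum identities collapse the expressions to x = 2cos(i1+i2) + 3cos(i1) and y = 2sin(i1+i2) + 3sin(i1), so the gradients at i1 = i2 = 0 can be worked out by hand with the chain rule:

import math

a = b = 0.0                            # i1 = i2 = 0
x = 2*math.cos(a + b) + 3*math.cos(a)  # 5.0
y = 2*math.sin(a + b) + 3*math.sin(a)  # 0.0
# Chain rule: dz/di = 2*(x + 2)*dx/di + 2*(y - 3)*dy/di
dz_di1 = 2*(x + 2)*(-2*math.sin(a + b) - 3*math.sin(a)) \
       + 2*(y - 3)*( 2*math.cos(a + b) + 3*math.cos(a))
dz_di2 = 2*(x + 2)*(-2*math.sin(a + b)) \
       + 2*(y - 3)*( 2*math.cos(a + b))
print(dz_di1, dz_di2)                  # -30.0 -12.0, matching i1.grad and i2.grad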
