autograd differentiation example in PyTorch - should be 9/8?
In the example from the Torch tutorial for Python, they use the following graph:
x = [[1, 1], [1, 1]]
y = x + 2
z = 3y^2
o = mean( z ) # 1/4 * z.sum()
Thus, the forward pass gets us this:
x_i = 1, y_i = 3, z_i = 27, o = 27
In code this looks like:
import torch
# define graph
x = torch.ones(2, 2, requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()
# if we don't do this, torch will only retain gradients for leaf nodes, ie: x
y.retain_grad()
z.retain_grad()
# does a forward pass
print(z, out)
However, I get confused by the gradients computed:
# now let's run our backward prop & get gradients
out.backward()
print(f'do/dx = {x.grad[0,0]}')
which outputs:
do/dx = 4.5
By the chain rule, do/dx = do/dz * dz/dy * dy/dx, where:
dy/dx = 1
dz/dy = 9/2 given x_i=1
do/dz = 1/4 given x_i=1
which means:
do/dx = 1/4 * 9/2 * 1 = 9/8
However, this doesn't match the gradient returned by Torch (9/2 = 4.5). Perhaps I have a math error (something with the do/dz = 1/4 term?), or I don't understand autograd in Torch.

Any pointers?
The error is in the dz/dy term: since z = 3y^2, dz/dy = 6y, not 9/2. So:

do/dz = 1 / 4
dz/dy = 6y = 6 * 3 = 18
dy/dx = 1

therefore, do/dx = 1/4 * 18 * 1 = 9/2
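Each chain-rule factor above can be checked numerically by rebuilding the same graph and reading the retained gradients (a quick sketch; `retain_grad()` is needed because Torch only keeps gradients for leaf tensors by default):

```python
import torch

# Rebuild the graph from the question.
x = torch.ones(2, 2, requires_grad=True)
y = x + 2            # y_i = 3
z = y * y * 3        # z_i = 27
out = z.mean()       # o = 27

# Retain gradients for the non-leaf intermediates.
y.retain_grad()
z.retain_grad()

out.backward()

print(f'do/dz = {z.grad[0, 0]}')   # 1/4 = 0.25
print(f'do/dy = {y.grad[0, 0]}')   # 1/4 * 6y = 18/4 = 4.5
print(f'do/dx = {x.grad[0, 0]}')   # 1/4 * 18 * 1 = 4.5
```

Every element of `x.grad` is 4.5 = 9/2, matching do/dz * dz/dy * dy/dx = (1/4) * 18 * 1.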