
autograd differentiation example in PyTorch - should be 9/8?

In the PyTorch tutorial example, they use the following graph:

x = [[1, 1], [1, 1]]
y = x + 2
z = 3y^2
o = mean( z )  # i.e. 1/4 * z.sum()

So the forward pass gets us:

x_i = 1, y_i = 3, z_i = 27, o = 27
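
Spelling o out as a function of x (just expanding the definitions above; all four elements are identical, so the mean equals any single term):

o = 1/4 * sum_i z_i = 1/4 * sum_i 3 * (x_i + 2)^2 = 3 * (1 + 2)^2 = 27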

In code, this looks like:

import torch

# define graph
x = torch.ones(2, 2, requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()

# if we don't do this, torch will only retain gradients for leaf nodes, ie: x
y.retain_grad()
z.retain_grad()

# the forward pass already ran above; print the intermediate results
print(z, out)

However, I'm confused by the gradients that get computed:

# now let's run our backward prop & get gradients
out.backward()
print(f'do/dx = {x.grad[0,0]}')

Output:

do/dx = 4.5

By the chain rule, do/dx = do/dz * dz/dy * dy/dx, where:

dy/dx = 1
dz/dy = 9/2 given x_i=1
do/dz = 1/4 given x_i=1

Meaning:

do/dx = 1/4 * 9/2 * 1 = 9/8

However, this doesn't match the gradient Torch returns (9/2 = 4.5). Maybe I have a math error (something to do with the do/dz = 1/4 term?), or I don't understand autograd in Torch.

Any pointers?

The error is in the dz/dy term: since z = 3y^2, dz/dy = 6y, not 9/2. Plugging in:

do/dz = 1/4
dz/dy = 6y = 6 * 3 = 18
dy/dx = 1

Therefore, do/dx = 1/4 * 18 * 1 = 18/4 = 9/2 = 4.5, which is exactly what Torch returns.
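
As a quick sanity check (a minimal sketch, not part of the original answer, reusing the same graph as above), each retained gradient can be printed after the backward pass; each one matches a factor in the chain rule:

import torch

x = torch.ones(2, 2, requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()

# gradients for non-leaf tensors are only kept if we ask before backward()
y.retain_grad()
z.retain_grad()

out.backward()

print(z.grad[0, 0])  # do/dz_i = 1/4                    -> tensor(0.2500)
print(y.grad[0, 0])  # do/dy_i = 1/4 * 6 * y_i = 9/2    -> tensor(4.5000)
print(x.grad[0, 0])  # do/dx_i = do/dy_i * dy/dx = 9/2  -> tensor(4.5000)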

