How to find a partial derivative in PyTorch
I have a model u(x,t) with layers 2X50, then 50X50, and 50X1.
I train the model with input x,t of size [100,2]. In the final layer I get u, and now I want to differentiate it with respect to x and t, and also take the second derivative with respect to x. How do I do this in PyTorch?
You can use PyTorch's autograd engine like so:
import torch

# A 2x50, 50x50, 50x1 network as described in the question
# (the activation choice is arbitrary here).
model = torch.nn.Sequential(
    torch.nn.Linear(2, 50), torch.nn.Tanh(),
    torch.nn.Linear(50, 50), torch.nn.Tanh(),
    torch.nn.Linear(50, 1),
)

# Keep x and t as separate leaf tensors so we can differentiate
# with respect to each; concatenate them to form the [100,2] input.
x = torch.randn(100, 1, requires_grad=True)
t = torch.randn(100, 1, requires_grad=True)
u = model(torch.cat([x, t], dim=1))  # shape [100,1]

# 1st derivatives. u is not a scalar, so pass grad_outputs;
# retain_graph=True keeps the graph alive for the second grad call.
ones = torch.ones_like(u)
du_dt = torch.autograd.grad(u, t, grad_outputs=ones, retain_graph=True)[0]
du_dx = torch.autograd.grad(u, x, grad_outputs=ones, create_graph=True)[0]

# 2nd derivative (higher orders require `create_graph=True` above)
d2u_dx2 = torch.autograd.grad(du_dx, x, grad_outputs=torch.ones_like(du_dx))[0]
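As a sanity check, you can run the same pattern on a function whose derivatives are known analytically. This hypothetical example (not from the question) uses u = x²·t, for which du/dx = 2xt, du/dt = x², and d²u/dx² = 2t:

```python
import torch

# Known function: u = x**2 * t, evaluated at x=3, t=5.
x = torch.tensor(3.0, requires_grad=True)
t = torch.tensor(5.0, requires_grad=True)
u = x**2 * t

# Scalar output, so no grad_outputs is needed here.
du_dt = torch.autograd.grad(u, t, retain_graph=True)[0]
du_dx = torch.autograd.grad(u, x, create_graph=True)[0]
d2u_dx2 = torch.autograd.grad(du_dx, x)[0]

print(du_dt.item())    # x**2     = 9.0
print(du_dx.item())    # 2*x*t    = 30.0
print(d2u_dx2.item())  # 2*t      = 10.0
```

If the values match the analytic derivatives, the same call structure will give correct partials for the network output as well.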