
How to use autograd to find MIN/MAX points

Suppose we have a simple function y = sin(x**2). How can we use autograd to find all the points x where the first derivative is 0?

The code below finds a point where the first derivative is zero by following the gradient until it vanishes. (Analytically, y' = 2x·cos(x²), so the stationary points are x = 0 and x = ±√(π/2 + kπ) for integer k ≥ 0.) However, depending on the random initialization, it will only find one such point. If you want to find all the points, you can iterate over many random initializations on some desired grid; see the multi-start sketch after the code.

import torch
import numpy as np

# random initialization of the starting point
x = torch.tensor(np.random.rand(1)).requires_grad_(True)

# iterate until the first derivative is approximately zero
while (x.grad is None or torch.abs(x.grad) > 0.01):
    if x.grad is not None:
        # zero out the gradient accumulated in the previous step
        x.grad.data.zero_()
    # compute the function value
    y = torch.sin(x ** 2)
    # compute dy/dx via autograd
    y.backward()
    # move downhill, i.e. toward a local minimum where the derivative vanishes
    x.data = x.data - 0.01 * x.grad.data
    # use the line below instead to move uphill, toward a local maximum
    # x.data = x.data + 0.01 * x.grad.data

print(x)                   # the stationary point found
print(torch.sin(x ** 2))   # the function value, recomputed at the final x
print(x.grad)              # the near-zero derivative there
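
To actually collect all the stationary points in an interval, the multi-start idea above can be made concrete. Below is a minimal sketch, not part of the original answer: the search interval [-3, 3], the grid density, the learning rate, the tolerance, and the helper name find_stationary_point are all illustrative assumptions. It runs gradient descent to find minima and gradient ascent to find maxima, then rounds the results to merge near-duplicates.

import torch

def find_stationary_point(x0, lr=0.01, tol=1e-3, max_steps=10000, uphill=False):
    # hypothetical helper: follow the gradient of y = sin(x**2) from x0
    # until |dy/dx| < tol; downhill finds minima, uphill finds maxima
    x = torch.tensor(x0, requires_grad=True)
    for _ in range(max_steps):
        y = torch.sin(x ** 2)
        grad, = torch.autograd.grad(y, x)
        if torch.abs(grad) < tol:
            return x.item()
        with torch.no_grad():
            x += (lr * grad) if uphill else (-lr * grad)
    return None  # this start did not converge within max_steps

# multi-start: sweep a grid of initializations over the chosen interval
found = set()
for x0 in torch.linspace(-3.0, 3.0, 61):
    for uphill in (False, True):   # downhill finds minima, uphill finds maxima
        point = find_stationary_point(float(x0), uphill=uphill)
        if point is not None:
            found.add(round(point, 2))  # round to merge near-duplicate results

print(sorted(found))

On that interval this should recover roughly x = 0, ±1.25, ±2.17 and ±2.80, matching the analytic roots ±√(π/2 + kπ) noted above; a trajectory that starts near the boundary may also step just outside the interval and converge to a stationary point there.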

Also see how to apply gradients manually in PyTorch.

