
PyTorch Autograd automatic differentiation feature

I am just wondering how PyTorch tracks operations on a tensor (after `.requires_grad` is set to `True`) and how it later computes the gradients automatically. Please help me understand the idea behind autograd. Thank you.

That's a great question! Generally, the idea of automatic differentiation (AutoDiff) is based on the multivariable chain rule, i.e. \frac{\partial x}{\partial z} = \frac{\partial x}{\partial y} \cdot \frac{\partial y}{\partial z}.
What this means is that you can express the derivative of x with respect to z via a "proxy" variable y; in fact, that allows you to break up almost any operation into a bunch of simpler (or atomic) operations that can then be "chained" together.
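As a minimal sketch of this in PyTorch (the function and values here are illustrative, not from the question): once `requires_grad=True` is set, each operation is recorded, and `backward()` multiplies the stored local derivatives along the chain.

```python
import torch

# Leaf tensor: PyTorch starts recording operations on it.
x = torch.tensor(2.0, requires_grad=True)

# Two atomic operations, each with a known local derivative:
y = x ** 2   # dy/dx = 2x
z = 3 * y    # dz/dy = 3

# Chain rule applied by autograd: dz/dx = dz/dy * dy/dx = 3 * 2x = 12 at x=2.
z.backward()
print(x.grad)  # tensor(12.)
```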
Now, what AutoDiff packages like Autograd do is simply store the derivative of each such atomic operation block, e.g., a division, a multiplication, etc. Then, at runtime, the forward-pass formula you provide (consisting of multiple of these blocks) can easily be turned into an exact derivative. Likewise, you can also provide derivatives for your own operations, should you think AutoDiff does not do exactly what you want it to.
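In PyTorch, providing your own derivative is done by subclassing `torch.autograd.Function`. A minimal sketch (the `Square` operation is a toy example chosen for illustration):

```python
import torch

class Square(torch.autograd.Function):
    """A custom atomic operation with a hand-written derivative."""

    @staticmethod
    def forward(ctx, inp):
        # Save the input so backward() can use it later.
        ctx.save_for_backward(inp)
        return inp ** 2

    @staticmethod
    def backward(ctx, grad_output):
        # Local derivative d(x^2)/dx = 2x, chained with the incoming gradient.
        (inp,) = ctx.saved_tensors
        return grad_output * 2 * inp

x = torch.tensor(3.0, requires_grad=True)
y = Square.apply(x)
y.backward()
print(x.grad)  # tensor(6.)
```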

The advantage of AutoDiff over derivative approximations like finite differences is simply that it gives an exact solution.

If you are further interested in how it works internally, I highly recommend the AutoDidact project, which aims to simplify the internals of an automatic differentiator, since there is usually also a lot of code optimization involved. Also, this set of slides from a lecture I took was really helpful in understanding.

