How to find and understand the autograd source code in PyTorch

I have a good understanding of the autograd algorithm, and I think I should learn about its source code in PyTorch. However, when I look at the project on GitHub, I am confused by the structure, because so many files mention autograd. So which part is the core code of autograd?

  1. Trying to understand the autograd Variable is probably the first thing you can do. As I understand it, autograd is just the name of the module containing the classes that add gradient tracking and backward functions to tensors.

  2. Be aware that a lot of the algorithm, e.g. back-propagation through the graph, is hidden in compiled code.

  3. If you look into __init__.py, you can get a glimpse of all the important functions (backward & grad).
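To make point 3 concrete, here is a minimal sketch of those two entry points, torch.autograd.grad and torch.autograd.backward, on a one-variable example (the numbers in the comments are just d(x³)/dx at x = 2):

```python
import torch

# Two of the functions exposed in torch/autograd/__init__.py.
x = torch.tensor(2.0, requires_grad=True)
y = x ** 3  # dy/dx = 3 * x**2 = 12 at x = 2

# grad() returns the gradients directly instead of accumulating them
# into .grad, which is convenient for quick experiments.
(dydx,) = torch.autograd.grad(y, x)
print(dydx)  # tensor(12.)

# backward() accumulates gradients into the leaf tensor's .grad attribute.
y = x ** 3
torch.autograd.backward(y)
print(x.grad)  # tensor(12.)
```

Both calls walk the same backward graph; they only differ in where the result ends up.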

I recommend you make associations between your understanding of autograd and the PyTorch data structures involved by building a simple graph and printing/visualizing its structure.
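For example, here is one way to do that printing (the walk helper is my own illustration, not a PyTorch API): every non-leaf tensor carries a grad_fn node, and next_functions links the nodes into the backward graph that autograd traverses.

```python
import torch

# A minimal graph: two leaf tensors combined through mul and add.
a = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(3.0, requires_grad=True)
c = a * b
d = c + a

def walk(fn, depth=0):
    """Recursively print the backward graph rooted at a grad_fn node."""
    print("  " * depth + type(fn).__name__)
    for next_fn, _ in fn.next_functions:
        if next_fn is not None:
            walk(next_fn, depth + 1)

walk(d.grad_fn)
# prints a tree like:
# AddBackward0
#   MulBackward0
#     AccumulateGrad
#     AccumulateGrad
#   AccumulateGrad
```

The AccumulateGrad leaves are where gradients land in a.grad and b.grad when you call d.backward().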


Reading the PyTorch source is doable, but you may be overwhelmed by the details. To get a basic idea of autograd, you may want to refer to some simple autograd implementations, such as https://evcu.github.io/ml/autograd/ and https://medium.com/@ralphmao95/simple-autograd-implementation-understand-automatic-differentiation-hand-by-hand-9e86f6d703ab
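In the spirit of those posts, here is a toy scalar autograd sketch (this is not PyTorch's real implementation, just the core idea): every operation records its inputs and a local backward rule, and backward() replays the graph in reverse topological order, applying the chain rule.

```python
class Value:
    """A scalar that remembers how it was computed."""
    def __init__(self, data, parents=(), backward_fn=lambda g: ()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward_fn = backward_fn  # maps upstream grad to parent grads

    def __mul__(self, other):
        return Value(self.data * other.data, (self, other),
                     lambda g: (g * other.data, g * self.data))

    def __add__(self, other):
        return Value(self.data + other.data, (self, other),
                     lambda g: (g, g))

    def backward(self):
        # Topological sort so each node is processed after its consumers.
        order, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                order.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(order):
            for parent, g in zip(v._parents, v._backward_fn(v.grad)):
                parent.grad += g  # accumulate, like PyTorch's .grad

x = Value(2.0)
y = Value(3.0)
z = x * y + x      # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

PyTorch's C++ engine does essentially this (plus tensors, buffering, and multithreading), which is why the compiled parts mentioned in point 2 are hard to find by browsing the Python files alone.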

