
Question about understanding of PyTorch .named_modules() loops

I am referring to this implementation here:

https://github.com/hszhao/semseg/blob/master/model/pspnet.py

In lines 49-58, the author writes:

for n, m in self.layer3.named_modules():
    if 'conv2' in n:
        m.dilation, m.padding, m.stride = (2, 2), (2, 2), (1, 1)
    elif 'downsample.0' in n:
        m.stride = (1, 1)

for n, m in self.layer4.named_modules():
    if 'conv2' in n:
        m.dilation, m.padding, m.stride = (4, 4), (4, 4), (1, 1)
    elif 'downsample.0' in n:
        m.stride = (1, 1)

What exactly is happening in these loops?

My understanding is that the author creates a ResNet model (his resnet.py here: https://github.com/hszhao/semseg/blob/master/model/resnet.py ) and then calls the different layers he implemented in his ResNet class to forward them below.

layer3 and layer4 in resnet.py are built by calling the function def _make_layer(self, block, planes, blocks, stride=1): , so I assume that when .named_modules() is used in the loops, it iterates over the modules created in that _make_layer function, right? If so, what happens in the elif part? There is no module called downsample.0, is there? (The only modules are nn.Conv2d and nn.BatchNorm2d.)

Below is an example of the ResNet used there.

model = ResNet(Bottleneck, [3, 4, 6, 3], **kwargs)

In the ResNet class, each Bottleneck block is given a downsample module, and the block applies it whenever it is not None:

if self.downsample is not None:
    residual = self.downsample(x)

That downsample can be an nn.Sequential or another layer; in this resnet.py, _make_layer builds it as an nn.Sequential holding a 1x1 nn.Conv2d followed by an nn.BatchNorm2d, so named_modules() exposes the conv under the name downsample.0.
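As a sketch of how those names arise (the channel numbers here are illustrative, not taken from the repo), we can build the same kind of nn.Sequential that _make_layer constructs and list its named_modules():

```python
import torch.nn as nn

# The kind of downsample that _make_layer builds:
# a 1x1 Conv2d followed by BatchNorm2d, wrapped in an nn.Sequential.
downsample = nn.Sequential(
    nn.Conv2d(64, 256, kernel_size=1, stride=2, bias=False),
    nn.BatchNorm2d(256),
)

# named_modules() yields the container itself ('') and its children,
# whose names are their indices inside the Sequential.
for name, module in downsample.named_modules():
    print(repr(name), type(module).__name__)
# '' Sequential
# '0' Conv2d
# '1' BatchNorm2d
```

Once this Sequential is stored as self.downsample inside a Bottleneck, the conv's qualified name becomes something like 0.downsample.0, so the check 'downsample.0' in n matches the 1x1 conv and the elif branch resets its stride.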

class Bottleneck(nn.Module):
    expansion = 4

    def __init__(self, inplanes, planes, stride=1, downsample=None):
        super(Bottleneck, self).__init__()
        self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=1, bias=False)
        self.bn1 = nn.BatchNorm2d(planes)
        self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, stride=stride,
                               padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(planes)
        self.conv3 = nn.Conv2d(planes, planes * self.expansion, kernel_size=1, bias=False)
        self.bn3 = nn.BatchNorm2d(planes * self.expansion)
        self.relu = nn.ReLU(inplace=True)
        self.downsample = downsample
        self.stride = stride
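As for what the two loops accomplish: they mutate the already-built conv2 modules in place, trading stride for dilation so that layer3 and layer4 stop downsampling while their receptive field keeps growing (the dilated-ResNet trick PSPNet relies on). A minimal sketch of the spatial-size effect on a standalone conv (input size and channels here are illustrative):

```python
import torch
import torch.nn as nn

x = torch.randn(1, 64, 32, 32)
conv2 = nn.Conv2d(64, 64, kernel_size=3, stride=2, padding=1, bias=False)
print(conv2(x).shape)  # stride 2 halves the map: torch.Size([1, 64, 16, 16])

# The same in-place mutation pspnet.py performs on layer3's conv2 modules:
conv2.dilation, conv2.padding, conv2.stride = (2, 2), (2, 2), (1, 1)
print(conv2(x).shape)  # resolution preserved: torch.Size([1, 64, 32, 32])
```

This works because nn.Conv2d reads self.stride, self.dilation and self.padding on every forward call, so assigning new tuples changes the behavior without touching the trained weights.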
