
Define flatten layer in neural network using pytorch

I'm trying to define a flatten layer before the fully connected layers. My input is a tensor with shape (512, 2, 2), so I want to flatten this tensor before the FC layers.

I get this error:

empty(): argument 'size' must be tuple of ints, but found element of type Flatten at pos 2

import torch
import torch.nn as nn
class Network(nn.Module):
    def __init__(self):
        super(Network,self).__init__()
        self.flatten=nn.Flatten()
        self.fc1=nn.Linear(self.flatten,512)
        self.fc2=nn.Linear(512,256)
        self.fc3=nn.Linear(256,3)
 
        
    def forward(self,x):
        x=self.flatten(x) # Flatten layer
        x=torch.ReLU(self.fc1(x))  
        x=torch.ReLU(self.fc2(x))
        x=torch.softmax(self.fc3(x))
        return x

This line is not correct:

        self.fc1 = nn.Linear(self.flatten, 512)

The first argument in_features of nn.Linear should be an int, not an nn.Module.

In your case, you defined the flatten attribute as an nn.Flatten module:

        self.flatten = nn.Flatten()

To fix this issue, you have to pass in_features equal to the number of features after flattening:

        self.fc1 = nn.Linear(n_features_after_flatten, 512)
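
Here is a minimal sketch of the corrected module, assuming the (512, 2, 2) input shape from the question, so the flattened feature count is 512 * 2 * 2 = 2048. It also replaces torch.ReLU (which does not exist) with torch.relu and passes an explicit dim to torch.softmax:

import torch
import torch.nn as nn

class Network(nn.Module):
    def __init__(self):
        super(Network, self).__init__()
        self.flatten = nn.Flatten()
        # in_features = number of features after flattening: 512 * 2 * 2 = 2048
        self.fc1 = nn.Linear(512 * 2 * 2, 512)
        self.fc2 = nn.Linear(512, 256)
        self.fc3 = nn.Linear(256, 3)

    def forward(self, x):
        x = self.flatten(x)                    # (N, 512, 2, 2) -> (N, 2048)
        x = torch.relu(self.fc1(x))
        x = torch.relu(self.fc2(x))
        x = torch.softmax(self.fc3(x), dim=1)  # probabilities over the 3 classes
        return x

# quick check with a dummy batch
net = Network()
out = net(torch.randn(4, 512, 2, 2))
print(out.shape)  # torch.Size([4, 3])

If your input shape changes, recompute in_features as channels × height × width of the tensor being flattened. Alternatively, newer PyTorch versions provide nn.LazyLinear, which infers in_features from the first forward pass if you prefer not to compute it by hand.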
