
How to write a PyTorch sequential model?

So far I have written my MLPs, RNNs and CNNs in Keras, but PyTorch is now gaining popularity in the deep learning community, so I have started learning this framework as well. I am a big fan of sequential models in Keras, which let you build simple models very quickly. I saw that PyTorch has this functionality too, but I don't know how to write one. I tried it this way:

import torch
import torch.nn as nn

net = nn.Sequential()
net.add(nn.Linear(3, 4))
net.add(nn.Sigmoid())
net.add(nn.Linear(4, 1))
net.add(nn.Sigmoid())
net.float()

print(net)

but it gives this error:

AttributeError: 'Sequential' object has no attribute 'add'

Also, if possible, can you give simple examples for RNN and CNN models in PyTorch sequential model?

Sequential does not have an add method at the moment, though there is some debate about adding this functionality.

As you can read in the documentation, nn.Sequential takes the layers as its arguments, either as a sequence of positional arguments or as an OrderedDict.

If you have a model with lots of layers, you can create a list first and then use the * operator to expand the list into positional arguments, like this:

layers = []
layers.append(nn.Linear(3, 4))
layers.append(nn.Sigmoid())
layers.append(nn.Linear(4, 1))
layers.append(nn.Sigmoid())

net = nn.Sequential(*layers)

This results in the same structure as your code, as if you had added the layers directly.
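Since nn.Sequential also accepts an OrderedDict, you can name the layers instead of relying on numeric indices. Here is a sketch of the same network built that way (the names 'fc1', 'act1', ... are chosen only for illustration):

from collections import OrderedDict

net = nn.Sequential(OrderedDict([
    ('fc1', nn.Linear(3, 4)),
    ('act1', nn.Sigmoid()),
    ('fc2', nn.Linear(4, 1)),
    ('act2', nn.Sigmoid())
]))

print(net)   # the submodules are now listed as (fc1), (act1), (fc2), (act2)
net.fc1      # individual layers can be accessed by name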

As described in the accepted answer, this is what it would look like as a sequence of arguments:

device = torch.device('cpu')
if torch.cuda.is_available():
    device = torch.device('cuda')

net = nn.Sequential(
      nn.Linear(3, 4),
      nn.Sigmoid(),
      nn.Linear(4, 1),
      nn.Sigmoid()
      ).to(device)


print(net)

Sequential(
  (0): Linear(in_features=3, out_features=4, bias=True)
  (1): Sigmoid()
  (2): Linear(in_features=4, out_features=1, bias=True)
  (3): Sigmoid()
)
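Regarding the CNN part of the question: a convolutional network fits into a single Sequential as well, as long as every step is an nn.Module. Below is a minimal sketch for single-channel 28x28 inputs (the layer sizes are chosen only for illustration, and nn.Flatten requires a reasonably recent PyTorch version):

cnn = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x28x28 -> 16x28x28
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 16x28x28 -> 16x14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # 16x14x14 -> 32x14x14
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 32x14x14 -> 32x7x7
    nn.Flatten(),                                 # -> 32 * 7 * 7 = 1568
    nn.Linear(32 * 7 * 7, 10)                     # 10 output classes
)

x = torch.randn(8, 1, 28, 28)   # a batch of 8 dummy images
print(cnn(x).shape)             # torch.Size([8, 10])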

As McLawrence said, nn.Sequential doesn't have an add method. The code in which you saw add being used probably monkey-patched torch.nn.Module with a function like this:

def add(self, module):
    # Delegate to the built-in nn.Module.add_module, using the current number of
    # submodules as the name (len() is defined for containers such as nn.Sequential)
    self.add_module(str(len(self)), module)

torch.nn.Module.add = add

After doing this, you can add a torch.nn.Module to a Sequential just like you posted in the question.
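With that patch applied, the snippet from the question runs roughly as posted; a quick sanity check (just a sketch, assuming the patch above has been executed first) could look like this:

net = nn.Sequential()
net.add(nn.Linear(3, 4))
net.add(nn.Sigmoid())
net.add(nn.Linear(4, 1))
net.add(nn.Sigmoid())

print(net)   # the submodules appear under the names generated by the patch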

Another common pattern, for example inside a model's __init__, is to build the layer list in a loop, where layers is a list of hidden-layer sizes, n_in is the input size, p is the dropout probability and out_sz is the output size:

layerlist = []
for i in layers:
    layerlist.append(nn.Linear(n_in, i))       # n_in input neurons connected to i output neurons
    layerlist.append(nn.ReLU(inplace=True))    # apply the ReLU activation function
    layerlist.append(nn.BatchNorm1d(i))        # apply batch normalization
    layerlist.append(nn.Dropout(p))            # apply dropout to prevent overfitting
    n_in = i                                   # the next layer's input size is this layer's output size

# Connect the last hidden layer to the output layer with a fully connected layer
layerlist.append(nn.Linear(layers[-1], out_sz))

self.layers = nn.Sequential(*layerlist)
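For the RNN part of the question, a plain nn.Sequential is a poor fit, because recurrent layers such as nn.RNN or nn.LSTM return a tuple (output, hidden state) rather than a single tensor, so the next module in the chain would receive a tuple. A common workaround (just one possible sketch, with illustrative sizes) is to wrap the recurrent layer in a small custom module instead:

class SimpleRNN(nn.Module):
    def __init__(self, input_size=10, hidden_size=20, num_classes=2):
        super().__init__()
        self.rnn = nn.RNN(input_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        out, _ = self.rnn(x)            # out: (batch, seq_len, hidden_size)
        return self.fc(out[:, -1, :])   # classify from the last time step

model = SimpleRNN()
x = torch.randn(4, 7, 10)   # batch of 4 sequences, length 7, 10 features each
print(model(x).shape)       # torch.Size([4, 2])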
