
How to implement an LSTM layer with multiple cells in PyTorch?

I intend to implement an LSTM with 2 layers and 256 cells in each layer, and I am trying to understand how the PyTorch LSTM framework supports this. The arguments to torch.nn.LSTM that I can set are input_size, hidden_size, num_layers, bias, batch_first, dropout, and bidirectional.

However, how do I have multiple cells in a single layer?

These cells are unrolled automatically based on the sequence length of your input. Please check out this code:

import torch
import torch.nn as nn

# One-hot encodings for the characters 'h', 'e', 'l', 'o'
h = [1, 0, 0, 0]
e = [0, 1, 0, 0]
l = [0, 0, 1, 0]
o = [0, 0, 0, 1]

# One-cell RNN: input_dim (4) -> output_dim (2). sequence: 5, batch: 3
# 3 sequences: 'hello', 'eolll', 'lleel'
# input shape = (3, 5, 4)
inputs = torch.Tensor([[h, e, l, l, o],
                       [e, o, l, l, l],
                       [l, l, e, e, l]])
print("input size", inputs.size())  # input size torch.Size([3, 5, 4])

cell = nn.RNN(input_size=4, hidden_size=2, batch_first=True)
hidden = torch.zeros(1, 3, 2)  # (num_layers, batch, hidden_size)

# Propagate input through RNN
# Input: (batch, seq_len, input_size) when batch_first=True
out, hidden = cell(inputs, hidden)
print("out size", out.size())  # out size torch.Size([3, 5, 2])

You can find more examples at https://github.com/hunkim/PyTorchZeroToAll/ .
