
Pytorch inconsistent size with pad_packed_sequence, seq2seq

I'm having some inconsistencies with the output of an encoder I got from this github.

The encoder looks as follows:

import pdb

import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence


class Encoder(nn.Module):
    r"""Applies a multi-layer LSTM to a variable-length input sequence.
    """

    def __init__(self, input_size, hidden_size, num_layers,
                 dropout=0.0, bidirectional=True, rnn_type='lstm'):
        super(Encoder, self).__init__()
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.num_layers = num_layers
        self.bidirectional = bidirectional
        self.rnn_type = rnn_type
        self.dropout = dropout
        if self.rnn_type == 'lstm':
            self.rnn = nn.LSTM(input_size, hidden_size, num_layers,
                               batch_first=True,
                               dropout=dropout,
                               bidirectional=bidirectional)

    def forward(self, padded_input, input_lengths):
        """
        Args:
            padded_input: N x T x D
            input_lengths: N
        Returns: output, hidden
            - **output**: N x T x H
            - **hidden**: (num_layers * num_directions) x N x H
        """
        total_length = padded_input.size(1)  # get the max sequence length
        packed_input = pack_padded_sequence(padded_input, input_lengths,
                                            batch_first=True, enforce_sorted=False)
        packed_output, hidden = self.rnn(packed_input)
        pdb.set_trace()
        output, _ = pad_packed_sequence(packed_output, batch_first=True,
                                        total_length=total_length)
        return output, hidden

So it only consists of an LSTM RNN; if I print the encoder, this is the output:

LSTM(40, 512, num_layers=8, batch_first=True, bidirectional=True)

So it should have a 512-sized output, right? But when I feed it a tensor of size torch.Size([16, 1025, 40]) (16 samples of 1025 vectors of size 40, which gets packed to fit the RNN), the output I get back from the RNN has a new encoded size of 1024, torch.Size([16, 1025, 1024]), when it should have been encoded to 512, right?
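A minimal reproduction of what I am seeing (a random tensor stands in for my actual input):

import torch
import torch.nn as nn

rnn = nn.LSTM(40, 512, num_layers=8, batch_first=True, bidirectional=True)
x = torch.randn(16, 1025, 40)  # 16 samples, 1025 timesteps, 40 features
output, (h, c) = rnn(x)
print(output.shape)  # torch.Size([16, 1025, 1024]), not [16, 1025, 512]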

Is there something I'm missing?

Setting bidirectional=True makes the LSTM bidirectional, which means there will be two LSTMs, one that goes from left to right and the other that goes from right to left.

From the nn.LSTM documentation - Outputs:

  • output of shape (seq_len, batch, num_directions * hidden_size): tensor containing the output features (h_t) from the last layer of the LSTM, for each t. If a torch.nn.utils.rnn.PackedSequence has been given as the input, the output will also be a packed sequence.

    For the unpacked case, the directions can be separated using output.view(seq_len, batch, num_directions, hidden_size), with forward and backward being direction 0 and 1 respectively. Similarly, the directions can be separated in the packed case.

Your output has the size [batch, seq_len, 2 * hidden_size] (batch and seq_len are swapped in your case due to setting batch_first=True) because you are using a bidirectional LSTM. The outputs of the two directions are concatenated so that the information of both is available, and you can easily separate them if you want to treat them differently, as in the sketch below.
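A minimal sketch of that separation, using the sizes from the question (the random tensor below stands in for the real encoder output):

import torch

batch, seq_len, hidden_size = 16, 1025, 512
# stand-in for the bidirectional encoder output: [batch, seq_len, 2 * hidden_size]
output = torch.randn(batch, seq_len, 2 * hidden_size)

# split the last dimension into (num_directions, hidden_size);
# this is the batch_first analogue of the view quoted from the docs
directions = output.view(batch, seq_len, 2, hidden_size)
forward_out = directions[:, :, 0, :]   # left-to-right LSTM: [16, 1025, 512]
backward_out = directions[:, :, 1, :]  # right-to-left LSTM: [16, 1025, 512]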
