How do I get return_sequences behavior for a stacked LSTM model in PyTorch?

Sha*_*oon 3 python keras tensorflow pytorch

I have a TensorFlow/Keras model:


        self.model.add(Bidirectional(LSTM(lstm1_size, input_shape=(
            seq_length, feature_dim), return_sequences=True)))
        self.model.add(BatchNormalization())
        self.model.add(Dropout(0.2))

        self.model.add(Bidirectional(
            LSTM(lstm2_size, return_sequences=True)))
        self.model.add(BatchNormalization())
        self.model.add(Dropout(0.2))

        # BOTTLENECK HERE

        self.model.add(Bidirectional(
            LSTM(lstm3_size, return_sequences=True)))
        self.model.add(BatchNormalization())
        self.model.add(Dropout(0.2))

        self.model.add(Bidirectional(
            LSTM(lstm4_size, return_sequences=True)))
        self.model.add(BatchNormalization())
        self.model.add(Dropout(0.2))

        self.model.add(Bidirectional(
            LSTM(lstm5_size, return_sequences=True)))
        self.model.add(BatchNormalization())
        self.model.add(Dropout(0.2))

        self.model.add(Dense(feature_dim, activation='linear'))

How do I use return_sequences? My understanding of return_sequences is that it makes an LSTM layer return its output at every time step, which is then fed into the next layer.

How would I accomplish this in PyTorch?

Ale*_*hev 5

PyTorch's LSTM always returns the full sequence of outputs.

https://pytorch.org/docs/stable/nn.html#lstm


Example:

import torch as t

batch_size = 2
time_steps = 10
features = 2
data = t.empty(batch_size, time_steps, features).normal_()

# With batch_first=True, output has shape (batch, time, num_directions * hidden_size)
lstm = t.nn.LSTM(input_size=2, hidden_size=3, bidirectional=True, batch_first=True)

output, (h_n, c_n) = lstm(data)
[output.shape, h_n.shape, c_n.shape]

[torch.Size([2, 10, 6]), torch.Size([2, 2, 3]), torch.Size([2, 2, 3])]
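Note that `output` is `(2, 10, 6)` because bidirectional LSTMs concatenate both directions, doubling `hidden_size`. Conversely, if you ever need Keras' `return_sequences=False` behavior, a minimal sketch is to slice the last time step of `output`:

```python
import torch as t

batch_size, time_steps, features = 2, 10, 2
data = t.empty(batch_size, time_steps, features).normal_()

lstm = t.nn.LSTM(input_size=2, hidden_size=3, bidirectional=True, batch_first=True)
output, (h_n, c_n) = lstm(data)

# Equivalent of Keras return_sequences=False: keep only the final time step.
last_step = output[:, -1, :]
print(last_step.shape)  # torch.Size([2, 6])
```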

class Net(t.nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.lstm_1 = t.nn.LSTM(input_size=2, hidden_size=3, bidirectional=True, batch_first=True)
        # The second LSTM consumes the first one's full sequence output,
        # so its input_size is 2 * 3 (bidirectional doubles the hidden size).
        self.lstm_2 = t.nn.LSTM(input_size=2*3, hidden_size=4, bidirectional=True, batch_first=True)

    def forward(self, input):
        output, (h_n, c_n) = self.lstm_1(input)
        output, (h_n, c_n) = self.lstm_2(output)
        return output

net = Net()

net(data).shape

torch.Size([2, 10, 8])
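To mirror the full Keras stack (Bidirectional LSTM → BatchNormalization → Dropout → Dense), one possible sketch follows. The layer sizes here are assumed placeholder values, not the question's actual `lstm*_size` values, and note that `t.nn.BatchNorm1d` expects the channel axis second, hence the transposes:

```python
import torch as t

class StackedBiLSTM(t.nn.Module):
    """Sketch of the Keras model above; lstm1_size/lstm2_size are assumptions."""
    def __init__(self, feature_dim=2, lstm1_size=3, lstm2_size=4):
        super().__init__()
        self.lstm_1 = t.nn.LSTM(feature_dim, lstm1_size, bidirectional=True, batch_first=True)
        self.bn_1 = t.nn.BatchNorm1d(2 * lstm1_size)  # expects (batch, channels, time)
        self.drop_1 = t.nn.Dropout(0.2)
        self.lstm_2 = t.nn.LSTM(2 * lstm1_size, lstm2_size, bidirectional=True, batch_first=True)
        self.bn_2 = t.nn.BatchNorm1d(2 * lstm2_size)
        self.drop_2 = t.nn.Dropout(0.2)
        # Dense(feature_dim, activation='linear') corresponds to a plain Linear layer.
        self.dense = t.nn.Linear(2 * lstm2_size, feature_dim)

    def forward(self, x):
        x, _ = self.lstm_1(x)
        # Move channels to dim 1 for BatchNorm1d, then back to batch_first layout.
        x = self.drop_1(self.bn_1(x.transpose(1, 2)).transpose(1, 2))
        x, _ = self.lstm_2(x)
        x = self.drop_2(self.bn_2(x.transpose(1, 2)).transpose(1, 2))
        return self.dense(x)

net = StackedBiLSTM()
out = net(t.empty(2, 10, 2).normal_())
print(out.shape)  # torch.Size([2, 10, 2])
```

Since every LSTM returns the whole sequence by default, each `output` passed along already behaves like `return_sequences=True` in Keras.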