Sha*_*oon 6 python dimensionality-reduction lstm pytorch
I have:
def __init__(self, feature_dim=15, hidden_size=5, num_layers=2):
    super(BaselineModel, self).__init__()
    self.num_layers = num_layers
    self.hidden_size = hidden_size
    self.lstm = nn.LSTM(input_size=feature_dim,
                        hidden_size=hidden_size, num_layers=num_layers)
Then I get this error:
RuntimeError: The size of tensor a (5) must match the size of tensor b (15) at non-singleton dimension 2
If I set both sizes to the same value, the error goes away. But I'm wondering: if my input_size is large, say 15, and I want to reduce the number of hidden features to 5, why shouldn't that work?
It should work; the error probably comes from somewhere else. This works, for example:
import numpy as np
import torch
import torch.nn as nn

feature_dim = 15
hidden_size = 5
num_layers = 2
seq_len = 5
batch_size = 3
lstm = nn.LSTM(input_size=feature_dim,
               hidden_size=hidden_size, num_layers=num_layers)
t1 = torch.from_numpy(np.random.uniform(0, 1, size=(seq_len, batch_size, feature_dim))).float()
output, states = lstm(t1)
hidden_state, cell_state = states
print("output: ", output.size())
print("hidden_state: ", hidden_state.size())
print("cell_state: ", cell_state.size())
and returns
output: torch.Size([5, 3, 5])
hidden_state: torch.Size([2, 3, 5])
cell_state: torch.Size([2, 3, 5])
Are you using the output somewhere after the LSTM? Notice that its size equals the hidden dim, i.e. the last dim is 5. It looks like you're using it later as if its last dim were 15.
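If the error comes from comparing the 5-dim LSTM output against a 15-dim tensor (e.g. a reconstruction loss against the input), one common fix is to project the hidden features back to `feature_dim` with a linear layer. A minimal sketch; the `nn.Linear` projection is a suggestion, not something from the question's code:

```python
import torch
import torch.nn as nn

feature_dim, hidden_size, num_layers = 15, 5, 2
seq_len, batch_size = 5, 3

lstm = nn.LSTM(input_size=feature_dim, hidden_size=hidden_size, num_layers=num_layers)
x = torch.rand(seq_len, batch_size, feature_dim)

output, _ = lstm(x)  # output has shape (seq_len, batch_size, hidden_size) -> last dim is 5

# Comparing `output` (last dim 5) element-wise with `x` (last dim 15), as in a
# reconstruction loss, raises exactly:
#   RuntimeError: The size of tensor a (5) must match the size of tensor b (15)
#   at non-singleton dimension 2

# Fix: project the hidden features back up to feature_dim before comparing.
proj = nn.Linear(hidden_size, feature_dim)
reconstructed = proj(output)  # shape (seq_len, batch_size, feature_dim)
print(reconstructed.size())   # torch.Size([5, 3, 15])
```

This keeps the dimensionality reduction inside the LSTM (15 → 5) while still producing a 15-dim tensor wherever the rest of the model expects one.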
Views: 674