Uiy*_*Kim 4 tensorflow pytorch
Is there any equivalent of tensorflow.keras.layers.TimeDistributed in PyTorch?
I am trying to build something like TimeDistributed(Resnet50()).
Credit for this goes to miguelvr.
You can use this code; it is a PyTorch module developed to mimic the TimeDistributed wrapper.
import torch.nn as nn

class TimeDistributed(nn.Module):
    def __init__(self, module, batch_first=False):
        super(TimeDistributed, self).__init__()
        self.module = module
        self.batch_first = batch_first

    def forward(self, x):
        if len(x.size()) <= 2:
            return self.module(x)

        # Squash samples and timesteps into a single axis
        x_reshape = x.contiguous().view(-1, x.size(-1))  # (samples * timesteps, input_size)

        y = self.module(x_reshape)

        # We have to reshape Y back to a sequence layout
        if self.batch_first:
            y = y.contiguous().view(x.size(0), -1, y.size(-1))  # (samples, timesteps, output_size)
        else:
            y = y.view(-1, x.size(1), y.size(-1))  # (timesteps, samples, output_size)

        return y
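A minimal usage sketch, assuming a batch-first (samples, timesteps, features) input and an nn.Linear as the wrapped module; the shapes shown are illustrative:

import torch
import torch.nn as nn

# Apply the same Linear layer independently at every timestep.
td = TimeDistributed(nn.Linear(128, 32), batch_first=True)

x = torch.randn(8, 10, 128)   # (samples, timesteps, input_size)
y = td(x)                     # (samples, timesteps, output_size)
print(y.shape)                # torch.Size([8, 10, 32])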
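Note that the wrapper above flattens only the last dimension, so it will not work directly for TimeDistributed(Resnet50()), whose input is 5-D (samples, timesteps, C, H, W). A hedged adaptation of the same idea that flattens all trailing dimensions (my own sketch, not part of the original answer):

import torch
import torch.nn as nn
import torchvision.models as models

class TimeDistributedND(nn.Module):
    # Variant that merges the batch and time axes for inputs of any rank.
    def __init__(self, module):
        super().__init__()
        self.module = module

    def forward(self, x):
        b, t = x.shape[0], x.shape[1]
        # (samples * timesteps, C, H, W) -> run the wrapped module once
        y = self.module(x.reshape(b * t, *x.shape[2:]))
        # Restore the time axis: (samples, timesteps, ...)
        return y.reshape(b, t, *y.shape[1:])

frames = torch.randn(2, 4, 3, 224, 224)        # (samples, timesteps, C, H, W)
td_resnet = TimeDistributedND(models.resnet50())
out = td_resnet(frames)                        # (2, 4, 1000)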