PyTorch tensor: how to swap the channel position - runtime error

New*_*irl 12 python-3.x conv-neural-network pytorch

My training dataset is as follows, where X_train is 3D data with 3 channels:

Shape of X_Train: (708, 256, 3). Shape of Y_Train: (708, 4)

I then convert them to tensors and feed them into a DataLoader:

import torch

X_train = torch.from_numpy(X_data)   # X_data: numpy array of shape (708, 256, 3)
y_train = torch.from_numpy(y_data)   # y_data: numpy array of shape (708, 4)
training_dataset = torch.utils.data.TensorDataset(X_train, y_train)
train_loader = torch.utils.data.DataLoader(training_dataset, batch_size=50, shuffle=False)

However, when training the model I get the following error: RuntimeError: Given groups=1, weight of size 24 3 5, expected input[708, 256, 3] to have 3 channels, but got 256 channels instead

I suppose this is because of where the channel dimension sits? In TensorFlow the channels come last, but in PyTorch the expected format is "batch size x channels x height x width"? So how do I swap the dimensions of the x_train tensor so that the batches coming out of the DataLoader match the expected layout?
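(For reference, nn.Conv1d takes input of shape (batch, channels, length), which is why the layer reads the 256-long axis as channels here. A minimal check of that convention, with shapes chosen only to mirror the data above, might look like this:)

import torch
import torch.nn as nn

conv = nn.Conv1d(in_channels=3, out_channels=24, kernel_size=5)   # weight of size 24 x 3 x 5
out = conv(torch.randn(50, 3, 256))   # (batch, channels, length) -> accepted
print(out.shape)                      # torch.Size([50, 24, 252])
# conv(torch.randn(50, 256, 3))       # channels-last layout -> the RuntimeError above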

import torch
import torch.nn as nn

class TwoLayerNet(torch.nn.Module):
    def __init__(self):
        super(TwoLayerNet, self).__init__()
        self.conv1 = nn.Sequential(
            nn.Conv1d(3, 3*8, kernel_size=5, stride=1),
            nn.Sigmoid(),
            nn.AvgPool1d(kernel_size=2, stride=2))   # pooling stride must be >= 1 (0 raises an error)
        self.conv2 = nn.Sequential(
            nn.Conv1d(3*8, 12, kernel_size=5, stride=1),
            nn.Sigmoid(),
            nn.AvgPool1d(kernel_size=2, stride=2))
        self.drop_out = nn.Dropout()   # must be defined here because forward() uses it

        self.fc1 = nn.Linear(708, 732)
        self.fc2 = nn.Linear(732, 4)

    def forward(self, x):
        out = self.conv1(x)
        out = self.conv2(out)
        out = out.reshape(out.size(0), -1)
        out = self.drop_out(out)
        out = self.fc1(out)
        out = self.fc2(out)
        return out

Coo*_*ess 19

Use permute.

X_train = torch.rand(708, 256, 3)
X_train = X_train.permute(2, 0, 1)
X_train.shape
# => torch.Size([3, 708, 256])

  • @NewGirl With "X_train.unsqueeze(0)" - you are missing the "batch" dimension. (2 upvotes)
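For the dataset in the question, the sample dimension presumably has to stay first so that the DataLoader still yields batches, so moving only the channel axis to position 1 (rather than to the front) matches what nn.Conv1d expects. A sketch with the shapes from the question:

import torch

X_train = torch.rand(708, 256, 3)     # (samples, length, channels), as in the question
X_train = X_train.permute(0, 2, 1)    # -> (samples, channels, length)
print(X_train.shape)                  # torch.Size([708, 3, 256])
# A DataLoader built on this tensor then yields batches of shape (50, 3, 256),
# which matches nn.Conv1d(3, 24, kernel_size=5).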