
Cross-entropy calculation in the PyTorch tutorial

I am working through the PyTorch tutorial on a multi-class classification problem, and the way PyTorch computes the loss confuses me. Could you help me sort this out?

The model used for classification is as follows:

import torch.nn as nn
import torch.nn.functional as F


class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(3, 6, 5)        # 3 input channels -> 6 feature maps, 5x5 kernel
        self.pool = nn.MaxPool2d(2, 2)         # 2x2 max pooling
        self.conv2 = nn.Conv2d(6, 16, 5)       # 6 -> 16 feature maps, 5x5 kernel
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)           # 10 output classes

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = x.view(-1, 16 * 5 * 5)             # flatten the feature maps
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)                        # raw scores (logits); no softmax here
        return x
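For completeness, the layer sizes suggest 3×32×32 inputs (as in the CIFAR-10 tutorial): two rounds of 5×5 convolution plus 2×2 pooling reduce a 32×32 image to 16 feature maps of size 5×5, which matches fc1's input of 16 * 5 * 5. A quick shape check with a dummy batch (the batch size of 4 is just for illustration):

import torch

# Dummy batch of 4 RGB 32x32 images (input size inferred from the layer dimensions)
inputs = torch.randn(4, 3, 32, 32)
net = Net()
outputs = net(inputs)
print(outputs.shape)  # torch.Size([4, 10]) -- one raw, unnormalized score per class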

The training loop is as follows:

optimizer.zero_grad()
outputs = net(inputs)
loss = nn.CrossEntropyLoss(outputs, …
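For context, my understanding of the documented usage is that nn.CrossEntropyLoss is instantiated once as a criterion and then called with (outputs, labels); it applies LogSoftmax and NLLLoss internally, which is why forward() returns raw scores without a softmax. A minimal sketch of that pattern, where labels and the optimizer hyperparameters are purely illustrative:

import torch
import torch.nn as nn
import torch.optim as optim

criterion = nn.CrossEntropyLoss()                 # combines LogSoftmax and NLLLoss in one module
optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)  # illustrative hyperparameters

# Illustrative labels: one class index in [0, 10) per sample in the batch of 4
labels = torch.randint(0, 10, (4,))

optimizer.zero_grad()
outputs = net(inputs)                             # raw logits, shape (4, 10)
loss = criterion(outputs, labels)                 # log-softmax is applied inside the criterion
loss.backward()
optimizer.step()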

machine-learning deep-learning cross-entropy pytorch

3 votes · 1 answer · 3083 views