I am trying to implement the following ResNet block, where the ResNet consists of blocks with two convolutional layers and a skip connection. For some reason, it does not add the output of the skip connection (when one is applied) or the input to the output of the convolutional layers.
The ResNet block has:
Two convolutional layers: 3x3 kernels, each followed by batch normalization.
A skip connection: the identity when the input and output shapes match, otherwise a 1x1 convolution with batch normalization to match the dimensions.
A ReLU nonlinearity applied after the first convolutional layer and at the end of the block.
My code:
import torch.nn as nn
import torch.nn.functional as F

class Block(nn.Module):
def __init__(self, in_channels, out_channels, stride=1):
"""
Args:
in_channels (int): Number of input channels.
out_channels (int): Number of output channels.
stride (int): Controls the stride.
"""
super(Block, self).__init__()
self.skip = nn.Sequential()
if stride != 1 or in_channels != out_channels:
self.skip = nn.Sequential(
nn.Conv2d(in_channels=in_channels, out_channels=out_channels, kernel_size=1, stride=stride, bias=False),
nn.BatchNorm2d(out_channels))
else:
self.skip = None
self.block = nn.Sequential(
nn.Conv2d(in_channels=in_channels, out_channels=out_channels, kernel_size=3, padding=1, stride=1, bias=False),
nn.BatchNorm2d(out_channels),
nn.ReLU(),
nn.Conv2d(in_channels=in_channels, out_channels=out_channels, kernel_size=3, padding=1, stride=1, bias=False),
nn.BatchNorm2d(out_channels))
def forward(self, x):
out = self.block(x)
if self.skip is not None:
out = self.skip(x)
else:
out = x
out += x
out = F.relu(out)
return out
The problem is the reuse of the out variable: out = self.block(x) is immediately overwritten by the skip branch (or by x itself), so the output of the convolutional layers is discarded and x ends up being added to itself. Normally, you would implement it like this:
def forward(self, x):
identity = x
out = self.block(x)
if self.skip is not None:
identity = self.skip(x)
out += identity
out = F.relu(out)
return out
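As a quick sanity check (a minimal sketch: the channel counts, tensor shape, and the use of eval() are my own assumptions, and it presumes Block uses the corrected forward above), you can verify that the convolutional path now contributes to the output. The original forward reduced to F.relu(x + x) whenever self.skip was None:

import torch
import torch.nn.functional as F

# With in_channels == out_channels and stride == 1, self.skip is None,
# so the original (buggy) forward would have returned F.relu(x + x),
# ignoring self.block entirely.
block = Block(in_channels=8, out_channels=8, stride=1)
block.eval()  # use BatchNorm running statistics so the comparison is deterministic

x = torch.randn(2, 8, 16, 16)
with torch.no_grad():
    out = block(x)

print(out.shape)                           # torch.Size([2, 8, 16, 16])
print(torch.allclose(out, F.relu(x + x)))  # False: the conv path now contributes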
If you prefer a "one-liner":
def forward(self, x):
out = self.block(x)
out += (x if self.skip is None else self.skip(x))
out = F.relu(out)
return out
And if you really like one-liners (please, that is too much, don't choose this option :)):
def forward(self, x):
return F.relu(self.block(x) + (x if self.skip is None else self.skip(x)))