What is the simplest way to take a PyTorch model and get a list of all its layers, without any nn.Sequential grouping? For example, is there a better way than this?
import torch.nn as nn
import pretrainedmodels

model = pretrainedmodels.__dict__['xception'](num_classes=1000, pretrained='imagenet')

l = []
def unwrap_model(model):
    for i in model.children():
        if isinstance(i, nn.Sequential): unwrap_model(i)
        else: l.append(i)
unwrap_model(model)
print(l)
小智 (12 votes)
I made this recursive so it works on deeper models, since not all blocks come from nn.Sequential.
import torch

def get_children(model: torch.nn.Module):
    # get the children of the model
    children = list(model.children())
    flatt_children = []
    if children == []:
        # if the model has no children, the model itself is the last child
        return model
    else:
        # look for children of children... down to the last child
        for child in children:
            try:
                flatt_children.extend(get_children(child))
            except TypeError:
                flatt_children.append(get_children(child))
    return flatt_children
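For instance, on a small nested model this flattens everything down to the leaf modules. This is a minimal sketch: the toy model below is made up for illustration, and the flattening uses an explicit isinstance check rather than the try/except above, but the idea is the same.

```python
import torch
import torch.nn as nn

def get_children(model: torch.nn.Module):
    # recursively flatten all child modules down to the leaves
    children = list(model.children())
    if not children:
        return model  # a leaf module is its own result
    flat = []
    for child in children:
        result = get_children(child)
        flat.extend(result if isinstance(result, list) else [result])
    return flat

# Toy model with a nested Sequential (made up for illustration)
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.Sequential(nn.ReLU(), nn.Linear(8, 2)),
)

print(get_children(model))
```

The nested Sequential disappears, leaving only the Linear and ReLU leaves in order.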
小智 (12 votes)
If you want a dict of named layers, this is the simplest way:
named_layers = dict(model.named_modules())
This returns something like:
{
    'conv1': <some conv layer>,
    'fc1': <some fc layer>,
    # ... and other layers
}
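Once you have the dict, you can look up any submodule by its name. A minimal sketch; the toy model and the names 'conv1' and 'fc1' below are made up for illustration:

```python
import torch.nn as nn

# Toy model; the layer names are chosen just for this example
model = nn.Sequential()
model.add_module('conv1', nn.Conv2d(3, 8, kernel_size=3))
model.add_module('fc1', nn.Linear(8, 2))

named_layers = dict(model.named_modules())
print(named_layers['conv1'])  # the Conv2d submodule
print(named_layers['fc1'])    # the Linear submodule
```

Note that named_modules() also yields the model itself under the empty-string key '', and for nested models the keys are dotted paths like 'block1.conv1'.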
You can iterate over all of a model's modules with the modules() method. It also descends into every nested Sequential.
l = [module for module in model.modules() if type(module) != nn.Sequential]
Here is a simple example:
model = nn.Sequential(nn.Linear(2, 2),
                      nn.ReLU(),
                      nn.Sequential(nn.Linear(2, 1), nn.Sigmoid()))
Output:
[Linear(in_features=2, out_features=2, bias=True),
ReLU(),
Linear(in_features=2, out_features=1, bias=True),
Sigmoid()]
Views: 2757