Chr*_*tte — tags: python, neural-network, python-3.x, pytorch
Python version: 3.8.5
PyTorch version: 1.6.0
I am defining an LSTM, a subclass of nn.Module. When I try to create the optimizer, I get the following error: torch.nn.modules.module.ModuleAttributeError: 'LSTM' object has no attribute 'paramters'
I have two code files, train.py and lstm_class.py (which contains the LSTM class). I will try to produce a minimal working example; let me know if any other information would help.
Code in lstm_class.py:
import torch.nn as nn

class LSTM(nn.Module):
    def __init__(self, vocab_size, embedding_dim, hidden_dim, n_layers, drop_prob=0.2):
        super(LSTM, self).__init__()

        # network size parameters
        self.n_layers = n_layers
        self.hidden_dim = hidden_dim
        self.vocab_size = vocab_size
        self.embedding_dim = embedding_dim

        # the layers of the network
        self.embedding = nn.Embedding(self.vocab_size, self.embedding_dim)
        self.lstm = nn.LSTM(self.embedding_dim, self.hidden_dim, self.n_layers, dropout=drop_prob, batch_first=True)
        self.dropout = nn.Dropout(drop_prob)
        self.fc = nn.Linear(self.hidden_dim, self.vocab_size)

    def forward(self, input, hidden):
        # defines the forward pass, probably isn't relevant
        pass

    def init_hidden(self, batch_size):
        # initializes the hidden state, probably isn't relevant
        pass
Code in train.py:
import torch
import torch.optim
import torch.nn as nn
import lstm_class

vocab_size = 1000
embedding_dim = 256
hidden_dim = 256
n_layers = 2

net = lstm_class.LSTM(vocab_size, embedding_dim, hidden_dim, n_layers)
optimizer = torch.optim.Adam(net.paramters(), lr=learning_rate)
I get the error on the last line written above. The full error message:
Traceback (most recent call last):
  File "train.py", line 58, in <module>
    optimizer = torch.optim.Adam(net.paramters(), lr=learning_rate)
  File "/usr/local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 771, in __getattr__
    raise ModuleAttributeError("'{}' object has no attribute '{}'".format(
torch.nn.modules.module.ModuleAttributeError: 'LSTM' object has no attribute 'paramters'
Any tips on how to fix this would be greatly appreciated. As mentioned above, please let me know if anything else would be relevant. Thanks.
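For what it's worth, the traceback suggests the attribute name is simply misspelled: nn.Module defines a method called parameters(), not paramters(). A minimal sketch of the corrected optimizer call (using a stand-in nn.Linear module and a hypothetical learning rate, since the real value isn't shown in the question):

```python
import torch
import torch.nn as nn

# Any nn.Module subclass exposes .parameters(); the question's LSTM
# class would behave the same way here.
net = nn.Linear(4, 2)

learning_rate = 0.001  # hypothetical value, not given in the question

# Correct spelling: parameters(), not paramters()
optimizer = torch.optim.Adam(net.parameters(), lr=learning_rate)
```

Because nn.Module overrides __getattr__ to look up registered submodules and parameters, any misspelled attribute access raises this ModuleAttributeError rather than a plain AttributeError.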