I ran into an error when using BatchNorm1d. My code:
```python
##% first I set a model
import torch
import torch.nn as nn
from torch.nn import init
from torch.nn import BatchNorm1d as BN  # assuming BN is an alias for nn.BatchNorm1d

class net(nn.Module):
    def __init__(self, max_len, feature_linear, rnn, input_size, hidden_size,
                 output_dim, num_rnn_layers, bidirectional, batch_first=True, p=0.2):
        super(net, self).__init__()
        self.max_len = max_len
        self.feature_linear = feature_linear
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.bidirectional = bidirectional
        self.num_directions = 2 if bidirectional else 1
        self.p = p
        self.batch_first = batch_first
        # project the time dimension (max_len) down to feature_linear
        self.linear1 = nn.Linear(max_len, feature_linear)
        init.kaiming_normal_(self.linear1.weight, mode='fan_in')
        self.BN1 = BN(feature_linear)

    def forward(self, xb, seq_len_crt):
        rnn_input = torch.zeros(xb.shape[0], self.feature_linear, self.input_size)
        for i in range(self.input_size):
            out …
```
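As far as I understand, `nn.BatchNorm1d(num_features)` only accepts input shaped `(N, C)` or `(N, C, L)`, and the channel dimension `C` must equal `num_features`. Here is a minimal sketch with made-up sizes (not taken from my model) showing which shapes pass and which one would fail:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(16)           # num_features = 16

x2d = torch.randn(8, 16)          # (N, C): accepted, C matches num_features
print(bn(x2d).shape)              # torch.Size([8, 16])

x3d = torch.randn(8, 16, 100)     # (N, C, L): accepted, normalized per channel over N and L
print(bn(x3d).shape)              # torch.Size([8, 16, 100])

x_bad = torch.randn(8, 100, 16)   # (N, L, C): passing this to bn raises a RuntimeError,
                                  # because dim 1 is 100 instead of the expected 16
```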