PyTorch: passing a numpy array for weight initialization

ytr*_*ewq 7 python numpy initialization pytorch rnn

I want to initialize the parameters of an RNN with numpy arrays.

In the following example, I want to pass w to the parameters of rnn. I know that PyTorch provides many initialization methods such as Xavier, uniform, etc., but is there a way to initialize the parameters by passing a numpy array?

import numpy as np
from torch import nn

input_size, hidden_size, num_layers = 3, 4, 2  # example sizes
rng = np.random.RandomState(313)
w = rng.randn(input_size, hidden_size).astype(np.float32)

rnn = nn.RNN(input_size, hidden_size, num_layers)

ben*_*che 3

First, note that nn.RNN has more than one weight variable; see the documentation:

Variables:

• weight_ih_l[k] – the learnable input-hidden weights of the k-th layer, of shape (hidden_size * input_size) for k = 0. Otherwise, the shape is (hidden_size * hidden_size)
• weight_hh_l[k] – the learnable hidden-hidden weights of the k-th layer, of shape (hidden_size * hidden_size)
• bias_ih_l[k] – the learnable input-hidden bias of the k-th layer, of shape (hidden_size)
• bias_hh_l[k] – the learnable hidden-hidden bias of the k-th layer, of shape (hidden_size)
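
To see these names and shapes on a concrete module, you can list them with named_parameters(); a minimal sketch (the sizes are arbitrary example values, not taken from the question):

import torch
from torch import nn

input_size, hidden_size, num_layers = 3, 4, 2
rnn = nn.RNN(input_size, hidden_size, num_layers)

# Prints weight_ih_l0, weight_hh_l0, bias_ih_l0, bias_hh_l0, weight_ih_l1, ... with shapes.
for name, param in rnn.named_parameters():
    print(name, tuple(param.shape))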

Now, each of these variables (Parameter instances) is an attribute of your nn.RNN module. You can access and edit them in two ways, as shown below:

• Solution 1: access all of the RNN's Parameter attributes by name (rnn.weight_hh_lK, rnn.weight_ih_lK, etc.):
import torch
from torch import nn
import numpy as np

input_size, hidden_size, num_layers = 3, 4, 2
use_bias = True
rng = np.random.RandomState(313)

rnn = nn.RNN(input_size, hidden_size, num_layers, bias=use_bias)

def set_nn_parameter_data(layer, parameter_name, new_data):
    # Replace the .data of the named Parameter attribute in place.
    param = getattr(layer, parameter_name)
    param.data = new_data

for i in range(num_layers):
    # weight_ih_l0 expects shape (hidden_size, input_size); deeper layers
    # expect (hidden_size, hidden_size).
    ih_cols = input_size if i == 0 else hidden_size
    weights_hh_layer_i = rng.randn(hidden_size, hidden_size).astype(np.float32)
    weights_ih_layer_i = rng.randn(hidden_size, ih_cols).astype(np.float32)
    set_nn_parameter_data(rnn, "weight_hh_l{}".format(i),
                          torch.from_numpy(weights_hh_layer_i))
    set_nn_parameter_data(rnn, "weight_ih_l{}".format(i),
                          torch.from_numpy(weights_ih_layer_i))

    if use_bias:
        bias_hh_layer_i = rng.randn(hidden_size).astype(np.float32)
        bias_ih_layer_i = rng.randn(hidden_size).astype(np.float32)
        set_nn_parameter_data(rnn, "bias_hh_l{}".format(i),
                              torch.from_numpy(bias_hh_layer_i))
        set_nn_parameter_data(rnn, "bias_ih_l{}".format(i),
                              torch.from_numpy(bias_ih_layer_i))
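A quick sanity check, assuming the snippet above has just run: the parameters now share their values with the numpy arrays that were assigned, so the last hidden-hidden array generated in the loop should match weight_hh_l1 exactly.

# weights_hh_layer_i still holds the layer-1 array after the loop finishes.
print(np.allclose(rnn.weight_hh_l1.detach().numpy(), weights_hh_layer_i))  # True
print(rnn.weight_ih_l0.shape)  # torch.Size([4, 3]) with the example sizes above
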
• Solution 2: access all of the RNN's Parameter attributes through the list attribute rnn.all_weights:
import torch
from torch import nn
import numpy as np

input_size, hidden_size, num_layers = 3, 4, 2
use_bias = True
rng = np.random.RandomState(313)

rnn = nn.RNN(input_size, hidden_size, num_layers, bias=use_bias)

for i in range(num_layers):
    # weight_ih_l0 expects shape (hidden_size, input_size); deeper layers
    # expect (hidden_size, hidden_size).
    ih_cols = input_size if i == 0 else hidden_size
    weights_hh_layer_i = rng.randn(hidden_size, hidden_size).astype(np.float32)
    weights_ih_layer_i = rng.randn(hidden_size, ih_cols).astype(np.float32)
    # all_weights[i] holds [weight_ih, weight_hh, bias_ih, bias_hh] for layer i.
    rnn.all_weights[i][0].data = torch.from_numpy(weights_ih_layer_i)
    rnn.all_weights[i][1].data = torch.from_numpy(weights_hh_layer_i)

    if use_bias:
        bias_hh_layer_i = rng.randn(hidden_size).astype(np.float32)
        bias_ih_layer_i = rng.randn(hidden_size).astype(np.float32)
        rnn.all_weights[i][2].data = torch.from_numpy(bias_ih_layer_i)
        rnn.all_weights[i][3].data = torch.from_numpy(bias_hh_layer_i)
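
As a side note (not part of the answer above): assigning to .data bypasses autograd's bookkeeping, and a common alternative is an in-place copy_ under torch.no_grad(). A minimal sketch of that variant, drawing arrays with whatever shape each parameter already has:

import numpy as np
import torch
from torch import nn

input_size, hidden_size, num_layers = 3, 4, 2
rng = np.random.RandomState(313)
rnn = nn.RNN(input_size, hidden_size, num_layers)

with torch.no_grad():
    for name, param in rnn.named_parameters():
        # Draw a numpy array matching the parameter's exact shape and copy it in.
        new_values = rng.randn(*param.shape).astype(np.float32)
        param.copy_(torch.from_numpy(new_values))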