I built a convolutional neural network with three convolutional layers and two fully connected layers, and I use tf.train.Saver() to save the variables. When I use inspect_checkpoint.py to examine the variables stored in the checkpoint file, why are two extra variables saved for each layer, such as Adam and Adam_1? Also, what are beta1_power and beta2_power?
conv_layer1_b (DT_FLOAT) [32]
conv_layer1_w (DT_FLOAT) [1,16,1,32]
conv_layer1_b/Adam (DT_FLOAT) [32]
conv_layer1_w/Adam (DT_FLOAT) [1,16,1,32]
conv_layer1_w/Adam_1 (DT_FLOAT) [1,16,1,32]
conv_layer1_b/Adam_1 (DT_FLOAT) [32]
conv_layer3_w/Adam (DT_FLOAT) [1,16,64,64]
conv_layer3_w (DT_FLOAT) [1,16,64,64]
conv_layer3_b/Adam_1 (DT_FLOAT) [64]
conv_layer3_b (DT_FLOAT) [64]
conv_layer3_b/Adam (DT_FLOAT) [64]
conv_layer3_w/Adam_1 (DT_FLOAT) [1,16,64,64]
conv_layer2_w/Adam_1 (DT_FLOAT) [1,16,32,64]
conv_layer2_w/Adam (DT_FLOAT) [1,16,32,64]
conv_layer2_w (DT_FLOAT) [1,16,32,64]
conv_layer2_b/Adam_1 (DT_FLOAT) [64]
conv_layer2_b (DT_FLOAT) [64]
conv_layer2_b/Adam (DT_FLOAT) [64]
beta1_power (DT_FLOAT) []
beta2_power (DT_FLOAT) []
NN1_w (DT_FLOAT) [2432,512]
NN1_b (DT_FLOAT) …
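For context on the names in the listing above: Adam keeps two extra slot variables per trainable variable (the first- and second-moment estimates, which show up as the `/Adam` and `/Adam_1` entries), plus two scalar accumulators that show up as `beta1_power` and `beta2_power`. The following is an illustrative pure-Python sketch of one Adam step (not the actual TensorFlow kernel); the names `m`, `v`, `b1p`, and `b2p` are my own labels for those four pieces of state:

```python
def adam_step(theta, grad, m, v, b1p, b2p,
              lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update over a flat list of parameters.

    m, v   -> per-variable slots (the /Adam and /Adam_1 checkpoint entries)
    b1p,b2p-> scalar running products beta1**t, beta2**t
              (the beta1_power / beta2_power checkpoint entries)
    """
    # Update biased moment estimates.
    m = [b1 * mi + (1 - b1) * g for mi, g in zip(m, grad)]
    v = [b2 * vi + (1 - b2) * g * g for vi, g in zip(v, grad)]
    # Advance the power accumulators: b1p = beta1**t after t steps.
    b1p *= b1
    b2p *= b2
    # Bias-corrected step size, using the power accumulators.
    lr_t = lr * (1 - b2p) ** 0.5 / (1 - b1p)
    theta = [t - lr_t * mi / (vi ** 0.5 + eps)
             for t, mi, vi in zip(theta, m, v)]
    return theta, m, v, b1p, b2p
```

Storing `m`, `v`, `b1p`, and `b2p` in the checkpoint is what lets training resume with the optimizer state intact, which is why they appear alongside the actual weights.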