I am trying to make two convolutional layers share the same weights, but the API does not seem to work.
import tensorflow as tf
x = tf.random_normal(shape=[10, 32, 32, 3])
with tf.variable_scope('foo') as scope:
conv1 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, scope=scope)
print(conv1.name)
conv2 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, scope=scope)
print(conv2.name)
This prints:
foo/foo/Relu:0
foo/foo_1/Relu:0
Switching from tf.contrib.layers.conv2d to tf.layers.conv2d does not solve the problem; tf.layers.conv2d shows the same behavior:
import tensorflow as tf
x = tf.random_normal(shape=[10, 32, 32, 3])
conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')
print(conv1.name)
conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')
print(conv2.name)
which gives:
conv/BiasAdd:0
conv_2/BiasAdd:0
Answer by kev*_*man (score: 16)
In the code you wrote, the variables are indeed reused between the two convolutional layers. Try this:
import tensorflow as tf
x = tf.random_normal(shape=[10, 32, 32, 3])
conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')
conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')
print([x.name for x in tf.global_variables()])
# prints
# [u'conv/kernel:0', u'conv/bias:0']
Note that only one kernel tensor and one bias tensor were created. Even though the two layers share the weights, they do not share the actual computation; each call builds its own ops, which is why you see two different op names in the output.
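
To convince yourself that the weights really are shared, you can run both outputs on the same input and compare them. This is a minimal sketch, not part of the original answer, assuming TensorFlow 1.x and NumPy are available; because both layers apply the same kernel and bias to the same tensor x, the results should be numerically identical even though the op names differ.

import numpy as np
import tensorflow as tf

x = tf.random_normal(shape=[10, 32, 32, 3])
conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')
conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # x is evaluated once per run call, so both layers see the same input batch
    out1, out2 = sess.run([conv1, conv2])
    print(np.allclose(out1, out2))  # True: same kernel and bias applied to the same input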