
Large test error with tf.layers.batch_normalization

I am trying to use batch normalization. I tried tf.layers.batch_normalization in a simple conv net for MNIST.

I get high train accuracy (>98%), but the test accuracy is very low (<50%). I tried changing the momentum value (0.8, 0.9, 0.99, 0.999) and playing with batch sizes, but it always behaves basically the same way. I train it for 20k iterations.

My code:

# Input placeholders
x = tf.placeholder(tf.float32, [None, 784], name='x-input')
y_ = tf.placeholder(tf.float32, [None, 10], name='y-input')
is_training = tf.placeholder(tf.bool)

# input layer
input_layer = tf.reshape(x, [-1, 28, 28, 1])
with tf.name_scope('conv1'):
    # Convolution #1 ([5,5] : [28x28x1]->[28x28x6])
    conv1 = tf.layers.conv2d(
        inputs=input_layer,
        filters=6,
        kernel_size=[5, 5],
        padding="same",
        activation=None
    )   

    #Batch Norm #1
    conv1_bn = tf.layers.batch_normalization(
        inputs=conv1,
        axis=-1,
        momentum=0.9,
        epsilon=0.001,
        center=True,
        scale=True,
        training=is_training,
        name='conv1_bn'
    )

    # apply relu
    conv1_bn_relu = tf.nn.relu(conv1_bn)
    # apply pool ([2,2] : [28x28x6]->[14x14x6])
    maxpool1=tf.layers.max_pooling2d(
        inputs=conv1_bn_relu, …
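A common cause of exactly this train/test accuracy gap with tf.layers.batch_normalization is that the moving mean/variance update ops are placed in the tf.GraphKeys.UPDATE_OPS collection and are not run automatically, so the statistics used at test time (training=False) never get updated. A minimal sketch of the documented fix, using a stand-in loss and optimizer (the actual loss and optimizer in the question are not shown):

```python
# TF1-style graph mode (tf.compat.v1 keeps this API available under TF2).
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

x = tf.placeholder(tf.float32, [None, 4])
is_training = tf.placeholder(tf.bool)

# Same kind of batch-norm layer as in the question.
h = tf.layers.batch_normalization(x, training=is_training, momentum=0.9)
loss = tf.reduce_mean(tf.square(h))  # stand-in loss for illustration

# The moving-average update ops live in UPDATE_OPS; make the train op
# depend on them so they run on every training step.
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)
```

Without the tf.control_dependencies wrapper, the moving mean stays near 0 and the moving variance near 1, so inference-mode normalization is wrong even though training-mode accuracy looks fine.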

batch-normalization

20 votes · 1 answer · 10k views
