Running the Adam Optimizer

Jse*_*mol 14 python conv-neural-network tensorflow

I am trying to run an AdamOptimizer for one training step, without success.

optimizer = tf.train.AdamOptimizer(learning_rate)
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    sess.run(optimizer.minimize(cost), feed_dict={X:X_data, Y: Y_data})

The console spits out an ugly error:

FailedPreconditionError (see above for traceback): Attempting to use uninitialized value beta1_power
 [[Node: beta1_power/read = Identity[T=DT_FLOAT, _class=["loc:@W1"], _device="/job:localhost/replica:0/task:0/cpu:0"](beta1_power)]]

In the code, cost is a well-defined function that implements a conv NN plus a logistic loss, taking two parameters X and Y (the input to the NN and the training labels, respectively).

Any ideas about what could be going wrong?

nes*_*uno 19

optimizer.minimize(cost) creates new operations and variables in the graph.

When you call sess.run(init), the variables that the .minimize method creates have not been defined yet: hence the error.

You just have to declare the minimize operation before invoking tf.global_variables_initializer():

optimizer = tf.train.AdamOptimizer(learning_rate)
minimize = optimizer.minimize(cost)
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    sess.run(minimize, feed_dict={X:X_data, Y: Y_data})