I have two networks: a Model that generates output and an Adversary that grades that output. Both were trained separately, but now I need to combine their outputs during a single session.

I'm trying to implement the solution proposed in this post: Running multiple pre-trained Tensorflow networks at the same time.

My code:
with tf.name_scope("model"):
    model = Model(args)
with tf.name_scope("adv"):
    adversary = Adversary(adv_args)

# ...

with tf.Session() as sess:
    tf.global_variables_initializer().run()

    # Get the variables specific to the `Model`
    # Also strip the superfluous ":0", which for some reason is not saved in the checkpoint
    model_varlist = {v.name.lstrip("model/")[:-2]: v
                     for v in tf.global_variables() if v.name[:5] == "model"}
    model_saver = tf.train.Saver(var_list=model_varlist)
    model_ckpt = tf.train.get_checkpoint_state(args.save_dir)
    model_saver.restore(sess, model_ckpt.model_checkpoint_path)

    # Get the variables specific to the `Adversary`
    adv_varlist = {v.name.lstrip("adv/")[:-2]: v …
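One pitfall in the snippet above, shown as a TensorFlow-free sketch: `str.lstrip("model/")` does not remove the prefix `"model/"`; it strips any leading run of the *characters* `m`, `o`, `d`, `e`, `l`, and `/`, so it can eat into the variable name itself. (`strip_scope` below is a hypothetical helper, not part of the original code.)

```python
# str.lstrip takes a set of characters, not a prefix, so names whose first
# segment shares characters with "model/" lose extra characters.
name = "model/dense/model_w:0"
print(name.lstrip("model/"))   # "nse/model_w:0" -- "de" of "dense" is eaten too

# Slice off the scope prefix and the ":0" suffix explicitly instead:
def strip_scope(name, scope):
    assert name.startswith(scope + "/")
    return name[len(scope) + 1:].rsplit(":", 1)[0]

print(strip_scope(name, "model"))  # "dense/model_w"
```

Slicing by the known scope length (and splitting off the `:0` output suffix) keeps the checkpoint keys exact.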
I have a TensorFlow-based neural network and a set of variables. The training function is as follows:
def train(load, step):
    """
    Defining the neural network is skipped here
    """
    train_step = tf.train.AdamOptimizer(1e-4).minimize(mse)

    # Saver
    saver = tf.train.Saver()

    if not load:
        # Initializing variables
        sess.run(tf.initialize_all_variables())
    else:
        saver.restore(sess, 'Variables/map.ckpt')
        print 'Model Restored!'

    # Perform stochastic gradient descent
    for i in xrange(step):
        train_step.run(feed_dict = {x: train, y_: label})

    # Save model
    save_path = saver.save(sess, 'Variables/map.ckpt')
    print 'Model saved in file: ', save_path
    print 'Training Done!'
I'm calling the training function like this:
# First train
train(False, 1)
# Following train
for i …