I have trained a model using the tf.data.Dataset API, so my training code looks something like this:
with graph.as_default():
    # Input pipeline: parse, scale, shuffle and batch the training TFRecords
    train_dataset = tf.data.TFRecordDataset(tfrecord_path)
    train_dataset = train_dataset.map(scale_features, num_parallel_calls=n_workers)
    train_dataset = train_dataset.shuffle(10000)
    train_dataset = train_dataset.padded_batch(batch_size, padded_shapes={...})

    # A feedable iterator, so the same `batch` tensor can be driven by any dataset
    handle = tf.placeholder(tf.string, shape=[])
    iterator = tf.data.Iterator.from_string_handle(handle,
                                                   train_dataset.output_types,
                                                   train_dataset.output_shapes)
    batch = iterator.get_next()
    ...
    # Model code
    ...
    train_iterator = train_dataset.make_initializable_iterator()

with tf.Session(graph=graph) as sess:
    train_handle = sess.run(train_iterator.string_handle())
    sess.run(tf.global_variables_initializer())
    for epoch in range(n_epochs):
        sess.run(train_iterator.initializer)
        while True:
            try:
                sess.run(optimizer, feed_dict={handle: train_handle})
            except tf.errors.OutOfRangeError:
                break
Now, after training the model, I want to run inference on examples that are not in the dataset, and I am not sure how to go about it.
To be clear, I know how to use a different dataset; for example, at test time I simply pass the handle of a test set.
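For reference, the handle switching I mean looks roughly like this (the test-set names and the predictions tensor below are illustrative, not from my actual code):

with graph.as_default():
    # Same preprocessing as the training pipeline
    test_dataset = tf.data.TFRecordDataset(test_tfrecord_path)
    test_dataset = test_dataset.map(scale_features, num_parallel_calls=n_workers)
    test_dataset = test_dataset.padded_batch(batch_size, padded_shapes={...})
    test_iterator = test_dataset.make_initializable_iterator()

with tf.Session(graph=graph) as sess:
    test_handle = sess.run(test_iterator.string_handle())
    sess.run(test_iterator.initializer)
    # The shared `batch` tensor now yields test examples instead of training ones
    sess.run(predictions, feed_dict={handle: test_handle})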
The question is: given the scaling scheme and the fact that the network expects a handle, how would I go about making a prediction for a new example that has not been written to a TFRecord?
If I modified the batch directly, I would be responsible for scaling the features beforehand, which is something I would like to avoid if possible.
So how should I infer single examples from a model trained the tf.data.Dataset way?
(This is not for production purposes; it is for evaluating what will happen if I change specific features.)
Actually, when you use the Dataset API, the graph contains a tensor named "IteratorGetNext:0", so you can feed your input directly to it:
# get the input tensor from the graph
input = graph.get_tensor_by_name("IteratorGetNext:0")
# define the target tensor you want to evaluate for your prediction
predictions = ...
# finally call session.run to make the prediction
sess.run(predictions, feed_dict={input: np.asanyarray(images), ...})
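A slightly fuller sketch of the same idea, assuming a session on graph with the trained weights already restored, that predictions is the model's output tensor, and that the new example is scaled and padded by hand exactly as scale_features and padded_batch would have done (scale_single_example and raw_example are hypothetical names):

import numpy as np

# Batch tensor produced by iterator.get_next(); if get_next() returns several
# tensors, they are named "IteratorGetNext:0", "IteratorGetNext:1", and so on.
batch_input = graph.get_tensor_by_name("IteratorGetNext:0")

# Reproduce the training-time preprocessing by hand, then feed a batch of size 1
example = scale_single_example(raw_example)   # hypothetical scaling helper
result = sess.run(predictions, feed_dict={batch_input: np.expand_dims(example, 0)})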