Here is an approach that I think is missing from the TF examples.
Task:
Each individual piece can be found somewhere, but I think having them all in one place would save TF beginners (like myself) a lot of time.
Let's tackle 1. In my case it is two sets of images:
import os
import numpy as np

# all filenames for .jpg in dir
# - list of fnames
# - list of labels
def path_fnames(f_path, label, ext=['.jpg', '.jpeg']):
    f_n = [f_path + '/' + f for f in sorted(os.listdir(f_path)) if os.path.splitext(f)[1].lower() in ext]
    f_l = [label] * len(f_n)
    return f_n, f_l
#
def dense_to_one_hot(labels_dense, num_classes=10, dtype=np.float32):
    """Convert class labels from scalars to one-hot vectors."""
    num_labels = labels_dense.shape[0]
    index_offset = np.arange(num_labels) * num_classes
    labels_one_hot = np.zeros((num_labels, num_classes), dtype=dtype)
    labels_one_hot.flat[index_offset + labels_dense.ravel()] = …
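To make the pieces above concrete, here is a minimal sketch (not in the original post) of how I assume the fnames and ohl lists used in the queue code below are built; the directory paths and the two-class split are invented:

# hypothetical layout: one directory per class (made-up paths)
a_fn, a_l = path_fnames('/data/imgs/class_a', 0)
b_fn, b_l = path_fnames('/data/imgs/class_b', 1)

fnames = a_fn + b_fn                              # list of filename strings
labels = np.array(a_l + b_l)                      # dense integer labels
ohl    = dense_to_one_hot(labels, num_classes=2)  # one-hot float labels fed to the queue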
Problem - TensorBoard only displays one image
Inspired by How to visualize the weights (variables) in a cnn in Tensorflow?
Here is the code:
# --- image reader ---
# - rsq: random shuffle queue with [fn l] pairs
def img_reader_jpg(rsq):
    fn, label = rsq.dequeue()
    img_b = tf.read_file(fn)
    img_u = tf.image.decode_jpeg(img_b, channels=3)
    img_f = tf.cast(img_u, tf.float32)
    img_4 = tf.expand_dims(img_f, 0)
    return img_4, label

# filenames and labels are pre-loaded
fv = tf.constant(fnames)
lv = tf.constant(ohl)

rsq = tf.RandomShuffleQueue(len(fnames), 0, [tf.string, tf.float32])
do_enq = rsq.enqueue_many([fv, lv])

# reading_op
image, label = img_reader_jpg(rsq)

# test: some op
im_t = tf.placeholder(tf.float32, shape=[None,30,30,3], …
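Not part of the question, but since the symptom is that TensorBoard shows only one image: with this TF 0.x API, tf.image_summary keeps at most max_images (default 3) images per tag, and each img_4 above is a batch of one, so batching the decoded images and raising max_images (or using distinct tags) is the direction I would try first. A rough sketch with made-up names (img_batch, the 'jpg_images' tag, the log directory):

# feed a whole batch of decoded images into a single summary op
img_batch = tf.placeholder(tf.float32, shape=[None, 30, 30, 3], name='img_batch')
img_summ  = tf.image_summary('jpg_images', img_batch, max_images=10)

writer = tf.train.SummaryWriter('/tmp/tf_log')   # hypothetical log dir
# inside the session loop:
#   s = sess.run(img_summ, feed_dict={img_batch: batch_np})   # batch_np: numpy array [N,30,30,3]
#   writer.add_summary(s, step)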
Based on this Converting trained Tensorflow model to protobuf I am trying to save/restore a TF graph, without success.
Here is the saver:
with tf.Graph().as_default():
    variable_node = tf.Variable(1.0, name="variable_node")
    output_node = tf.mul(variable_node, 2.0, name="output_node")
    sess = tf.Session()
    init = tf.initialize_all_variables()
    sess.run(init)
    output = sess.run(output_node)
    tf.train.write_graph(sess.graph.as_graph_def(), summ_dir, 'model_00_g.pbtxt', as_text=True)
    # self.assertNear(2.0, output, 0.00001)
    saver = tf.train.Saver()
    saver.save(sess, saver_path)
It produces model_00_g.pbtxt with a text description of the graph. It is almost a copy-paste from freeze_graph_test.py.
Here is the reader:
with tf.Session() as sess:
    with tf.Graph().as_default():
        graph_def = tf.GraphDef()
        graph_path = '/mnt/code/test_00/log/2016-02-11.22-37-46/model_00_g.pbtxt'
        with open(graph_path, "rb") as f:
            proto_b = f.read()
            # print proto_b # -> I can see it
        graph_def.ParseFromString(proto_b)  # no luck..
        _ = tf.import_graph_def(graph_def, name="")
        print …
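My guess at why ParseFromString fails (this is not code from the question): write_graph was called with as_text=True, so model_00_g.pbtxt is a text-format proto, while ParseFromString expects the binary serialization. A sketch of the two obvious ways around it:

from google.protobuf import text_format

# option 1: parse the text-format .pbtxt with the text parser
graph_def = tf.GraphDef()
with open(graph_path, 'r') as f:
    text_format.Merge(f.read(), graph_def)
_ = tf.import_graph_def(graph_def, name="")

# option 2: write the graph in binary form instead and keep ParseFromString
# tf.train.write_graph(sess.graph.as_graph_def(), summ_dir, 'model_00_g.pb', as_text=False)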
For the Docker installation I see:
I know that 2. & 3. come with the source code; I currently use 2.
What is the difference between 2. & 3.? Which one is recommended for "normal" use?
TLDR:
First of all - thanks for the Docker images! They are the simplest and cleanest way to get started with TF.
It is not about the images.