How do I keep a TensorFlow session open between predictions when loading from SavedModel?

bw4*_*4sz 10 python tensorflow

I trained a TensorFlow model and I want to run predictions from numpy arrays. The use case is image processing for video: I pass images to the model as they arrive, and not every frame gets passed.

I reload my SavedModel within a session like this:

def run(self):                
    with tf.Session(graph=tf.Graph()) as sess:
        tf.saved_model.loader.load(sess,
                    [tf.saved_model.tag_constants.SERVING], "model")

My code works perfectly if I pass a list of images (self.tfimages) to the prediction. Condensed to:

    softmax_tensor = sess.graph.get_tensor_by_name('final_ops/softmax:0')
    predictions = sess.run(softmax_tensor, {'Placeholder:0': self.tfimages})

But I won't have all the images at once. Do I really have to reload the model from file every time (it takes 2+ minutes)?

I'd like to do something like this:

class tensorflow_model:
    def __init__(self):
        with tf.Session(graph=tf.Graph()) as self.sess:
            tf.saved_model.loader.load(self.sess,
                        [tf.saved_model.tag_constants.SERVING], "model")

    def predict(self):

        # Feed the image_data as input to the graph and get first prediction
        softmax_tensor = self.sess.graph.get_tensor_by_name('final_ops/softmax:0')

        predictions = self.sess.run(softmax_tensor, {'Placeholder:0': self.tfimages})

but this yields

builtins.RuntimeError: Attempted to use a closed Session.

Is there a way to keep a session open between predictions, or perhaps to load the SavedModel independently of a session?

EDIT: I tried the first answer, creating a session in two steps:

sess=tf.Session(graph=tf.Graph())
sess
<tensorflow.python.client.session.Session object at 0x0000021ACBB62EF0>
tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")
Traceback (most recent call last):
  Debug Probe, prompt 138, line 1
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\saved_model\loader_impl.py", line 222, in load
    saver.restore(sess, variables_path)
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\training\saver.py", line 1428, in restore
    {self.saver_def.filename_tensor_name: save_path})
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\client\session.py", line 774, in run
    run_metadata_ptr)
  File "C:\Program Files\Python35\Lib\site-packages\tensorflow\python\client\session.py", line 905, in _run
    raise RuntimeError('The Session graph is empty.  Add operations to the '
builtins.RuntimeError: The Session graph is empty.  Add operations to the graph before calling run().

with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")

executes without error.

As for the second idea of passing sess into the class as a variable, that's a good one. This works:

with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")
    tensorflow_instance=tensorflow(read_from="file")
    tensorflow_instance.predict(sess)

But this does not:

sess=tf.Session(graph=tf.Graph())
tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")
tensorflow_instance=tensorflow(read_from="file")
tensorflow_instance.predict(sess)

Wrapping my whole program inside a with ... as sess statement would be very awkward.

Full code:

import tensorflow as tf
import sys
from google.protobuf import text_format
from tensorflow.core.framework import graph_pb2
import os
import glob

class tensorflow:

    def __init__(self, read_from):

        # frames to be analyzed
        self.tfimages = []

        find_photos = glob.glob("*.jpg")

        # Read in the image_data
        if read_from == "file":
            for x in find_photos:
                image_data = tf.gfile.FastGFile(x, 'rb').read()
                self.tfimages.append(image_data)

        # Loads label file, strips off carriage return
        self.label_lines = [line.rstrip() for line in tf.gfile.GFile("dict.txt")]

    def predict(self, sess):

        # Feed the image_data as input to the graph and get first prediction
        softmax_tensor = sess.graph.get_tensor_by_name('final_ops/softmax:0')

        predictions = sess.run(softmax_tensor, {'Placeholder:0': self.tfimages})
        for prediction in predictions:
            # Sort to show labels of first prediction in order of confidence
            top_k = prediction.argsort()[-len(prediction):][::-1]

            for node_id in top_k:
                human_string = self.label_lines[node_id]
                score = prediction[node_id]
                print('%s (score = %.5f)' % (human_string, score))
            return human_string

if __name__ == "__main__":
    with tf.Session(graph=tf.Graph()) as sess:
        tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
        tensorflow_instance = tensorflow(read_from="file")
        tensorflow_instance.predict(sess)

    sess = tf.Session(graph=tf.Graph())
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], "model")
    tensorflow_instance = tensorflow(read_from="file")
    tensorflow_instance.predict(sess)

rha*_*l80 7

Others have already explained why you can't put your session in a with statement in the constructor.
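To make this concrete, here is a minimal sketch (the class name is illustrative, not from the question) of what goes wrong when the with block lives inside __init__:

import tensorflow as tf

class BadModel(object):
    def __init__(self):
        # The context manager's __exit__ runs as soon as __init__ returns,
        # which closes the session immediately.
        with tf.Session(graph=tf.Graph()) as self.sess:
            pass  # the SavedModel would be loaded here

m = BadModel()
# m.sess still exists as an attribute, but it is already closed; any
# m.sess.run(...) now raises "RuntimeError: Attempted to use a closed Session."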

The reason you see different behavior when using the context manager is that tf.saved_model.loader.load has some odd interactions between the default graph and the graph that is part of the session.

The solution is simple: just don't pass a graph to the session if you aren't using it in a with block:

sess=tf.Session()
tf.saved_model.loader.load(sess,[tf.saved_model.tag_constants.SERVING], "model")
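If you would rather keep the model isolated in its own graph instead of the process-wide default graph, an alternative sketch (not part of the original answer) is to load while that graph is installed as the default:

import tensorflow as tf

graph = tf.Graph()
sess = tf.Session(graph=graph)

# tf.saved_model.loader.load adds ops to the current default graph, so make
# the session's graph the default while loading. This avoids the
# "The Session graph is empty" error without needing a with-block session.
with graph.as_default():
    tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], "model")

softmax_tensor = sess.graph.get_tensor_by_name('final_ops/softmax:0')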

Here's sample code for a class that makes predictions:

import glob
import tensorflow as tf


class Model(object):

  def __init__(self, model_path):
    # Note, if you don't want to leak this, you'll want to turn Model into
    # a context manager. In practice, you probably don't have to worry
    # about it.
    self.session = tf.Session()

    tf.saved_model.loader.load(
        self.session,
        [tf.saved_model.tag_constants.SERVING],
        model_path)

    self.softmax_tensor = self.session.graph.get_tensor_by_name('final_ops/softmax:0')

  def predict(self, images):
    predictions = self.session.run(self.softmax_tensor, {'Placeholder:0': images})
    # TODO: convert to human-friendly labels
    return predictions


images = [tf.gfile.FastGFile(f, 'rb').read() for f in glob.glob("*.jpg")]
model = Model('model_path')
print(model.predict(images))

# Alternatively (uses less memory, but has lower throughput):
for f in glob.glob("*.jpg"):
  print(model.predict([tf.gfile.FastGFile(f, 'rb').read()]))
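If you want to avoid leaking the session, here is a sketch of the context-manager variant hinted at in the comment above (the class name ManagedModel is mine; the tensor names come from the question):

import glob
import tensorflow as tf

class ManagedModel(object):
  """Like Model above, but closes its tf.Session deterministically."""

  def __init__(self, model_path):
    self.session = tf.Session()
    tf.saved_model.loader.load(
        self.session,
        [tf.saved_model.tag_constants.SERVING],
        model_path)
    self.softmax_tensor = self.session.graph.get_tensor_by_name(
        'final_ops/softmax:0')

  def __enter__(self):
    return self

  def __exit__(self, exc_type, exc_value, tb):
    self.session.close()

  def predict(self, images):
    return self.session.run(self.softmax_tensor, {'Placeholder:0': images})

# Usage: the session is closed automatically when the block exits.
with ManagedModel('model_path') as model:
  images = [tf.gfile.FastGFile(f, 'rb').read() for f in glob.glob("*.jpg")]
  print(model.predict(images))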