TensorFlow 0.12 model files

mha*_*hat 4 c++ artificial-intelligence deep-learning tensorflow tensorflow-serving

I train a model and save it using:

saver = tf.train.Saver()
saver.save(session, './my_model_name')

In addition to the checkpoint file, which simply contains pointers to the model's latest checkpoints, this creates the following three files in the current path:

  1. my_model_name.meta
  2. my_model_name.index
  3. my_model_name.data-00000-of-00001

I would like to know what each of these files contains.

I would also like to load this model in C++ and run inference. The label_image example loads the model from a single .pb file using ReadBinaryProto(). I would like to know how to load it from these three files. What is the C++ equivalent of the following?

new_saver = tf.train.import_meta_graph('./my_model_name.meta')
new_saver.restore(session, './my_model_name')

Mar*_*cka 6

The checkpoint your Saver creates is called a "Checkpoint V2" and was introduced in TF 0.12.

It works very well for me (although the documentation for the C++ part is horrible, so it took me a day to get it working). Some people suggest freezing the graph by converting all variables to constants, but that is actually not needed.
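
(For contrast, the frozen-graph route is what the label_image example mentioned in the question uses: the whole model lives in a single .pb file that is read with ReadBinaryProto() and handed straight to the session, with no restore step. A minimal sketch, assuming a hypothetical file models/frozen_model.pb:)

#include <tensorflow/core/public/session.h>
#include <tensorflow/core/framework/graph.pb.h>
#include <stdexcept>

using namespace std;
using namespace tensorflow;

// Sketch only: "models/frozen_model.pb" is a hypothetical frozen graph
// (all variables already converted to constants).
GraphDef frozen_graph_def;
Status status = ReadBinaryProto(Env::Default(), "models/frozen_model.pb", &frozen_graph_def);
if (!status.ok()) {
    throw runtime_error("Error reading frozen graph: " + status.ToString());
}

Session* session = NewSession(SessionOptions());
status = session->Create(frozen_graph_def);
if (!status.ok()) {
    throw runtime_error("Error creating graph: " + status.ToString());
}
// No checkpoint restore is needed: the weights are baked into the GraphDef.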

Python part (saving)

with tf.Session() as sess:
    tf.train.Saver(tf.trainable_variables()).save(sess, 'models/my-model')

You can save yourself some headaches and storage space if you create the Saver with tf.trainable_variables(). But maybe some more complicated models need all of their data saved; in that case, remove this argument to Saver, and just make sure the Saver is created after your graph is finished. It is also very wise to give all variables/layers unique names, otherwise you can run into various problems.

C++ part (inference)

Note that checkpointPath is not a path to any existing file, just their common prefix. If you mistakenly put the path to the .index file there, TF won't tell you that it is wrong, but it will die during inference due to uninitialized variables.

#include <tensorflow/core/public/session.h>
#include <tensorflow/core/protobuf/meta_graph.pb.h>

using namespace std;
using namespace tensorflow;

...
// set up your input paths
const string pathToGraph = "models/my-model.meta";
const string checkpointPath = "models/my-model";
...

auto session = NewSession(SessionOptions());
if (session == nullptr) {
    throw runtime_error("Could not create Tensorflow session.");
}

Status status;

// Read in the protobuf graph we exported
MetaGraphDef graph_def;
status = ReadBinaryProto(Env::Default(), pathToGraph, &graph_def);
if (!status.ok()) {
    throw runtime_error("Error reading graph definition from " + pathToGraph + ": " + status.ToString());
}

// Add the graph to the session
status = session->Create(graph_def.graph_def());
if (!status.ok()) {
    throw runtime_error("Error creating graph: " + status.ToString());
}

// Read weights from the saved checkpoint
Tensor checkpointPathTensor(DT_STRING, TensorShape());
checkpointPathTensor.scalar<std::string>()() = checkpointPath;
status = session->Run(
        {{ graph_def.saver_def().filename_tensor_name(), checkpointPathTensor },},
        {},
        {graph_def.saver_def().restore_op_name()},
        nullptr);
if (!status.ok()) {
    throw runtime_error("Error loading checkpoint from " + checkpointPath + ": " + status.ToString());
}

// and run the inference to your liking
auto feedDict = ...
auto outputOps = ...
std::vector<tensorflow::Tensor> outputTensors;
status = session->Run(feedDict, outputOps, {}, &outputTensors);
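
What goes into feedDict and outputOps depends on your graph. As a rough sketch, continuing from the session above (the tensor names "input:0" and "output:0" and the 1x784 input shape are made-up placeholders; use the names and shapes from your own model):

// Hypothetical example of filling in the feed and fetch lists.
// "input:0" / "output:0" are placeholder names; replace them with your graph's tensors.
Tensor input(DT_FLOAT, TensorShape({1, 784}));
input.flat<float>().setZero();                 // fill with your real input data

std::vector<std::pair<string, Tensor>> feedDict = {{"input:0", input}};
std::vector<string> outputOps = {"output:0"};

std::vector<Tensor> outputTensors;
status = session->Run(feedDict, outputOps, {}, &outputTensors);
if (!status.ok()) {
    throw runtime_error("Error running inference: " + status.ToString());
}

// The fetched tensors come back in the same order as outputOps.
auto result = outputTensors[0].flat<float>();  // e.g. class scores
for (int i = 0; i < result.size(); ++i) {
    std::cout << i << ": " << result(i) << std::endl;  // needs <iostream>
}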

For completeness, here is the Python equivalent:

Inference in Python

with tf.Session() as sess:
    saver = tf.train.import_meta_graph('models/my-model.meta')
    saver.restore(sess, tf.train.latest_checkpoint('models/'))
    outputTensors = sess.run(outputOps, feed_dict=feedDict)