Suh*_*shi 15 deep-learning keras tensorflow pytorch onnx
I've exported my model to ONNX via:
# Export the model
torch_out = torch.onnx._export(learn.model,                # model being run
                               x,                          # model input (or a tuple for multiple inputs)
                               EXPORT_PATH + "mnist.onnx", # where to save the model (can be a file or file-like object)
                               export_params=True)         # store the trained parameter weights inside the model file
Now I'm trying to convert the model to a TensorFlow Lite file so I can do inference on Android. Unfortunately, PyTorch/Caffe2 support for Android is fairly lacking or too complex, but TensorFlow looks much simpler.
The documentation on going from ONNX to TFLite doesn't make this very clear.
I tried exporting to a TensorFlow GraphDef proto with:
tf_rep.export_graph(EXPORT_PATH + 'mnist-test/mnist-tf-export.pb')
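For reference, tf_rep above comes from onnx-tf's prepare(); a minimal sketch of the surrounding steps, assuming EXPORT_PATH is the export directory used earlier:

import onnx
from onnx_tf.backend import prepare

EXPORT_PATH = "./"  # assumed export directory

# Load the ONNX model exported from PyTorch above
onnx_model = onnx.load(EXPORT_PATH + "mnist.onnx")

# Wrap it in onnx-tf's TensorFlow backend representation
tf_rep = prepare(onnx_model)

# Export it as a TensorFlow GraphDef proto
tf_rep.export_graph(EXPORT_PATH + 'mnist-test/mnist-tf-export.pb')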
and then running toco:
toco \
    --graph_def_file=mnist-tf-export.pb \
    --input_format=TENSORFLOW_GRAPHDEF \
    --output_format=TFLITE \
    --inference_type=FLOAT \
    --input_type=FLOAT \
    --input_arrays=0 \
    --output_arrays=add_10 \
    --input_shapes=1,3,28,28 \
    --output_file=mnist.tflite
When I do that, though, I get the following error:
File "anaconda3/lib/python3.6/site-packages/tensorflow/lite/python/convert.py", line 172, in toco_convert_protos
"TOCO failed. See console for info.\n%s\n%s\n" % (stdout, stderr))
tensorflow.lite.python.convert.ConverterError: TOCO failed. See console for info.
2018-11-06 16:28:33.864889: I tensorflow/lite/toco/import_tensorflow.cc:1268] Converting unsupported operation: PyFunc
2018-11-06 16:28:33.874130: F tensorflow/lite/toco/import_tensorflow.cc:114] Check failed: attr.value_case() == AttrValue::kType (1 vs. 6)
Besides, even when I do run the command, I don't know what to specify for input_arrays or output_arrays, since the model was originally built in PyTorch.
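One way to find out what to pass for input_arrays and output_arrays is to list the node names in the exported GraphDef; a sketch using TensorFlow's GraphDef proto API, with the .pb filename taken from the toco command above:

import tensorflow as tf

# Read the frozen GraphDef and print every node's name and op,
# so the input/output array names can be identified
graph_def = tf.compat.v1.GraphDef()
with open('mnist-tf-export.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

for node in graph_def.node:
    print(node.name, node.op)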
Has anyone successfully converted an ONNX model to TFLite?
Here's the ONNX file I'm trying to convert: https://drive.google.com/file/d/1sM4RpeBVqPNw1WeCROpKLdzbSJPWSK79/view?usp=sharing
Extra info
Ahw*_*war 12
I think the ONNX file you have given, i.e. model.onnx, is corrupted. I don't know what the issue is, but it does not run any inference on ONNX Runtime.
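A quick way to sanity-check an exported ONNX file is to run the checker and a single dummy inference; a sketch assuming onnx and onnxruntime are installed, with the input shape taken from the question's toco flags:

import numpy as np
import onnx
import onnxruntime as ort

# Structural validation of the ONNX graph
model = onnx.load("mnist.onnx")
onnx.checker.check_model(model)

# One dummy inference; shape (1, 3, 28, 28) matches the question's --input_shapes
sess = ort.InferenceSession("mnist.onnx", providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name
dummy = np.random.randn(1, 3, 28, 28).astype(np.float32)
print(sess.run(None, {input_name: dummy})[0].shape)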
You can now run PyTorch models directly on mobile phones; check out PyTorch Mobile's documentation here.
This answer is for TensorFlow version 1; for TensorFlow version 2 or higher, follow this link.
The best way to convert the model from a protobuf freeze graph to TFLite is to follow the official TensorFlow Lite converter documentation.
According to the TensorFlow docs, TocoConverter has been deprecated:
This class (tf.compat.v1.lite.TocoConverter) has been deprecated. Please use lite.TFLiteConverter instead.
The best practice when converting a model from PyTorch to ONNX is to pass the input_names and output_names parameters to torch.onnx.export(), so the input and output layers of your model have explicit names:
# Export the model from PyTorch to ONNX
torch.onnx.export(model,                        # model being run
                  x,                            # model input (or a tuple for multiple inputs)
                  EXPORT_PATH + "mnist.onnx",   # where to save the model (can be a file or file-like object)
                  export_params=True,           # store the trained parameter weights inside the model file
                  input_names=['main_input'],   # specify the name of the input layer in the ONNX model
                  output_names=['main_output']) # specify the name of the output layer in the ONNX model
So, in your case: now export this model to a TensorFlow protobuf frozen graph using onnx-tf.
Please note that this method only works when the TensorFlow version is < 2.
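If you want to guard against running the steps below under the wrong TensorFlow, a trivial version check (my addition, not part of the original answer):

import tensorflow as tf

# This conversion path relies on TF1 frozen-graph tooling
assert tf.__version__.startswith('1.'), (
    "This conversion path requires TensorFlow 1.x, found " + tf.__version__)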
To convert the model, install onnx-tf version 1.5.0 with the command below:
pip install onnx-tf==1.5.0
Now, to convert the .onnx model to a TensorFlow freeze graph, run the following command in a shell:
onnx-tf convert -i "mnist.onnx" -o "mnist.pb"
Now, to convert this model from the .pb file to a TFLite model, use this code:
import tensorflow as tf

# Make a converter object from the saved TensorFlow frozen-graph file
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    'mnist.pb',                     # TensorFlow freeze-graph .pb model file
    input_arrays=['main_input'],    # name of the input arrays as defined in torch.onnx.export above
    output_arrays=['main_output'],  # name of the output arrays as defined in torch.onnx.export above
)

# Tell the converter which optimization techniques to use; to see the best
# option for optimization, read the TFLite documentation:
# https://www.tensorflow.org/lite/guide/get_started#4_optimize_your_model_optional
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Convert the model
tf_lite_model = converter.convert()

# Save the converted model
with open('mnist.tflite', 'wb') as f:
    f.write(tf_lite_model)
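To verify the converted model, a minimal sketch that loads mnist.tflite with the TFLite interpreter and runs one dummy inference (the input shape is read from the model itself):

import numpy as np
import tensorflow as tf

# Load the converted TFLite model
interpreter = tf.lite.Interpreter(model_path='mnist.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one random input of the expected shape and run inference
dummy = np.random.randn(*input_details[0]['shape']).astype(np.float32)
interpreter.set_tensor(input_details[0]['index'], dummy)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]['index']).shape)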
To choose the option best suited to optimizing your model for its use case, see the official guide on TensorFlow Lite optimization:
https://www.tensorflow.org/lite/guide/get_started#4_optimize_your_model_optional
Note: You can try my Jupyter notebook Convert ONNX model to Tensorflow Lite on Google Colaboratory (link).