Problem Description
I've exported my model to ONNX via:
# Export the model
torch_out = torch.onnx._export(learn.model,                 # model being run
                               x,                           # model input (or a tuple for multiple inputs)
                               EXPORT_PATH + "mnist.onnx",  # where to save the model (can be a file or file-like object)
                               export_params=True)          # store the trained parameter weights inside the model file
And now I am trying to convert the model to a TensorFlow Lite file so that I can do inference on Android. Unfortunately, PyTorch/Caffe2 support for Android is fairly lacking or too complex, but TensorFlow appears much simpler.
The documentation for ONNX to TFLite is pretty light on this.
I've tried exporting to a Tensorflow GraphDef proto via:
tf_rep.export_graph(EXPORT_PATH + 'mnist-test/mnist-tf-export.pb')
And then ran toco:
toco \
--graph_def_file=mnist-tf-export.pb \
--input_format=TENSORFLOW_GRAPHDEF \
--output_format=TFLITE \
--inference_type=FLOAT \
--input_type=FLOAT \
--input_arrays=0 \
--output_arrays=add_10 \
--input_shapes=1,3,28,28 \
--output_file=mnist.tflite
When I do, though, I get the following error:
File "anaconda3/lib/python3.6/site-packages/tensorflow/lite/python/convert.py", line 172, in toco_convert_protos
"TOCO failed. See console for info.\n%s\n%s\n" % (stdout, stderr))
tensorflow.lite.python.convert.ConverterError: TOCO failed. See console for info.
2018-11-06 16:28:33.864889: I tensorflow/lite/toco/import_tensorflow.cc:1268] Converting unsupported operation: PyFunc
2018-11-06 16:28:33.874130: F tensorflow/lite/toco/import_tensorflow.cc:114] Check failed: attr.value_case() == AttrValue::kType (1 vs. 6)
Further, even when I run the command I don't know what to specify for the input_arrays or output_arrays since the model was originally built in PyTorch.
Has anyone successfully converted their ONNX model to TFlite?
Here's the ONNX file I'm trying to convert: https://drive.google.com/file/d/1sM4RpeBVqPNw1WeCROpKLdzbSJPWSK79/view?usp=sharing
Additional information
- Python 3.6.6 :: Anaconda custom (64-bit)
- onnx.version = '1.3.0'
- tf.version = '1.13.0-dev20181106'
- torch.version = '1.0.0.dev20181029'
Recommended Answer
I think the ONNX file you provided (i.e. model.onnx) is corrupted. I don't know what the issue is, but it does not run inference on ONNX Runtime.
The best way to convert the model from a protobuf FreezeGraph to TFLite is to use the official TensorFlow Lite converter documentation.
According to the TensorFlow Docs, TocoConverter has been deprecated.
Convert from PyTorch to ONNX model
The best practice when converting a model from PyTorch to ONNX is to add the following parameters to the torch.onnx.export() function to specify the names of your model's input and output layers:
# Export the model from PyTorch to ONNX
torch.onnx.export(model,                        # model being run
                  x,                            # model input (or a tuple for multiple inputs)
                  EXPORT_PATH + "mnist.onnx",   # where to save the model (can be a file or file-like object)
                  export_params=True,           # store the trained parameter weights inside the model file
                  input_names=['main_input'],   # specify the name of the input layer in the ONNX model
                  output_names=['main_output']) # specify the name of the output layer in the ONNX model
So in your case: now export this model to a TensorFlow protobuf FreezeGraph using onnx-tf.
Convert from ONNX to TensorFlow FreezeGraph
To convert the model, install onnx-tf version 1.5.0 with the command below:
pip install onnx-tf==1.5.0
Now, to convert the .onnx model to a TensorFlow frozen graph, run this command in a shell:
onnx-tf convert -i "mnist.onnx" -o "mnist.pb"
Convert from TensorFlow FreezeGraph .pb to TFLite
Now, to convert the model from the .pb file to a TFLite model, use this code:
import tensorflow as tf
# make a converter object from the saved tensorflow file
converter = tf.lite.TFLiteConverter.from_frozen_graph('mnist.pb', #TensorFlow freezegraph .pb model file
input_arrays=['main_input'], # name of input arrays as defined in torch.onnx.export function before.
output_arrays=['main_output'] # name of output arrays defined in torch.onnx.export function before.
)
# tell converter which type of optimization techniques to use
converter.optimizations = [tf.lite.Optimize.DEFAULT]
# to view the best option for optimization read documentation of tflite about optimization
# go to this link https://www.tensorflow.org/lite/guide/get_started#4_optimize_your_model_optional
# convert the model
tf_lite_model = converter.convert()
# save the converted model
open('mnist.tflite', 'wb').write(tf_lite_model)
To choose the optimization option that best fits your model's use case, see this official guide about TensorFlow Lite optimization:
https://www.tensorflow.org/lite/guide/get_started#4_optimize_your_model_optional