How to convert a TensorFlow 2.0 Estimator model to TensorFlow Lite

This article explains how to convert a TensorFlow 2.0 Estimator model to TensorFlow Lite. The approach described here may be a useful reference for anyone hitting the same problem.

Problem description

The following code produces a regular TensorFlow model, but when I try to convert it to TensorFlow Lite it does not work, even though I followed these documentation pages:

https://www.tensorflow.org/tutorials/estimator/linear
https://www.tensorflow.org/lite/guide/get_started

export_dir = "tmp"
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
  tf.feature_column.make_parse_example_spec(feat_cols))

estimator.export_saved_model(export_dir, serving_input_fn)

# Convert the model.
converter = tf.lite.TFLiteConverter.from_saved_model("tmp/1571728920/saved_model.pb")
tflite_model = converter.convert()

Error message

Traceback (most recent call last):
  File "C:/Users/Dacorie Smith/PycharmProjects/JamaicaClassOneNotifableModels/ClassOneModels.py", line 208, in <module>
    tflite_model = converter.convert()
  File "C:\Users\Dacorie Smith\PycharmProjects\JamaicaClassOneNotifableModels\venv\lib\site-packages\tensorflow_core\lite\python\lite.py", line 400, in convert
    raise ValueError("This converter can only convert a single "
ValueError: This converter can only convert a single ConcreteFunction. Converting multiple functions is under development.

Extract from the documentation

The following example shows a TensorFlow SavedModel being converted into the TensorFlow Lite format:

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)

Recommended answer

Try using a concrete function:

export_dir = "tmp"
serving_input_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
  tf.feature_column.make_parse_example_spec(feat_cols))

estimator.export_saved_model(export_dir, serving_input_fn)

# Convert the model.
saved_model_obj = tf.saved_model.load(export_dir="tmp/1571728920/")
concrete_func = saved_model_obj.signatures['serving_default']

converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])

# print(saved_model_obj.signatures.keys())
# converter.optimizations = [tf.lite.Optimize.DEFAULT]
# converter.experimental_new_converter = True

tflite_model = converter.convert()

serving_default is the default signature key in a SavedModel.

If that does not work, try uncommenting converter.experimental_new_converter = True and the two lines above it.
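Once convert() succeeds, the resulting bytes can be sanity-checked with tf.lite.Interpreter. The sketch below uses a toy tf.function in place of the exported Estimator (whose feature columns and data are not shown in the question), so the model and its input shape are assumptions for illustration only:

```python
import tensorflow as tf

# Toy model standing in for the exported Estimator's serving signature.
@tf.function(input_signature=[tf.TensorSpec(shape=[1, 2], dtype=tf.float32)])
def model(x):
    return tf.reduce_sum(x, axis=1)

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [model.get_concrete_function()])
tflite_model = converter.convert()

# Run the converted model with the TFLite interpreter to verify it.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], tf.constant([[1.0, 2.0]]).numpy())
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result)  # expected: [3.]
```

The same interpreter pattern works unchanged on the bytes produced from the Estimator's serving_default signature.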

Brief explanation

Based on the Concrete functions guide

Eager execution in TensorFlow 2 evaluates operations immediately, without building graphs. To save the model you need a graph, which is wrapped in a Python callable: a concrete function.
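A minimal sketch of that idea, independent of the Estimator in the question: tracing a tf.function with a fixed input signature produces a concrete function, which is the graph the converter needs.

```python
import tensorflow as tf

# Eager code wrapped in tf.function; tracing it with an input
# signature yields a concrete function (a graph).
@tf.function
def square(x):
    return x * x

concrete = square.get_concrete_function(tf.TensorSpec([None], tf.float32))

# The concrete function is callable and runs the traced graph.
result = concrete(tf.constant([3.0])).numpy()
print(result)  # expected: [9.]
```

In the answer above, saved_model_obj.signatures['serving_default'] plays the same role: it is a concrete function recovered from the SavedModel rather than traced on the spot.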

This concludes the article on converting a TensorFlow 2.0 Estimator model to TensorFlow Lite; we hope the recommended answer is helpful.
