Problem description
I would like to get the intermediate layer outputs of a TFLite graph, something along the lines of the below.
The above solution works only on frozen graphs. Since SavedModel is the preferred way of serializing a model in TF 2.0, I would like a solution that works with a SavedModel. I tried passing --output_arrays to "toco" with savedModelDir as input, but that did not help.
From the documentation, it looks like SignatureDefs in SavedModel is the option to achieve this. But, I could not get it working.
x = test_images[0:1]
output = model.predict(x, batch_size=1)

# Pseudocode of what I tried: a SignatureDef exposing an intermediate
# layer ("Dense_1") alongside the final output
signature_def = signature_def_utils.build_signature_def(
    inputs={name: "x:0", dtype: DT_FLOAT, tensor_shape: (1, 28, 28, 1)},
    outputs={{name: "output:0", dtype: DT_FLOAT, tensor_shape: (1, 10)},
             {name: "Dense_1:0", dtype: DT_FLOAT, tensor_shape: (1, 10)}})
tf.saved_model.save(model, './tf-saved-model-sigdefs', signature_def)
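For reference, this is roughly the TF 2.x signature mechanism I have been trying to use (only a sketch; the layer name "dense_1" and the input shape are placeholders taken from the tutorial model), but I still could not see how to carry the extra output through to TFLite:

intermediate = tf.keras.Model(model.inputs, model.get_layer("dense_1").output)

@tf.function(input_signature=[tf.TensorSpec([None, 28, 28, 1], tf.float32)])
def serving_fn(x):
    # Return the final prediction and the intermediate activation as named outputs
    return {"output": model(x), "dense_1": intermediate(x)}

tf.saved_model.save(model, './tf-saved-model-sigdefs',
                    signatures={"serving_default": serving_fn})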
Can you share an example usage of SignatureDefs for this purpose? BTW, I have been playing around with the following tutorial for this experiment: https://www.tensorflow.org/beta/tutorials/images/intro_to_cnns
Recommended answer
Below is the best solution that I have so far.
from __future__ import absolute_import, division, print_function
#!pip install -q tensorflow==2.0.0-alpha0
import tensorflow as tf
from tensorflow.keras import datasets, layers, models
(train_images, train_labels), (test_images, test_labels) = datasets.mnist.load_data()
train_images = train_images.reshape((60000, 28, 28, 1))
test_images = test_images.reshape((10000, 28, 28, 1))
# Normalize pixel values to be between 0 and 1
train_images, test_images = train_images / 255.0, test_images / 255.0
model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.MaxPooling2D((2, 2)))
model.add(layers.Conv2D(64, (3, 3), activation='relu'))
model.add(layers.Flatten())
model.add(layers.Dense(64, activation='relu'))
model.add(layers.Dense(10, activation='softmax'))
model.summary()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(train_images, train_labels, epochs=5)
test_loss, test_acc = model.evaluate(test_images, test_labels)
# Saving in SavedModel format
tf.keras.experimental.export_saved_model(model, './tf-saved-model')
# Saving in TFLite FlatBuffer format
tflite_mnist_model = "mnist_model.tflite"
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
open(tflite_mnist_model, "wb").write(tflite_model)
# Cloning the Keras model with additional (debug) outputs
outputs = [model.get_layer("dense").output, model.get_layer("dense_1").output]
model_debug = tf.keras.Model(inputs=model.inputs, outputs=outputs)
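# Optional sanity check (not part of the original solution): the cloned model
# now returns two arrays per input, the 64-unit dense activations and the
# 10-way softmax probabilities.
dense_out, softmax_out = model_debug.predict(test_images[0:1])
print(dense_out.shape, softmax_out.shape)  # expected: (1, 64) (1, 10)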
# Saving in TFLite FlatBuffer format with debug information
tflite_mnist_model = "mnist_model_debug.tflite"
converter = tf.lite.TFLiteConverter.from_keras_model(model_debug)
tflite_model = converter.convert()
open(tflite_mnist_model, "wb").write(tflite_model)
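To check that the intermediate tensor is actually exposed, the debug FlatBuffer can be run with the TFLite interpreter. A minimal sketch (the output ordering is not guaranteed, so the output details are simply iterated):

import numpy as np

interpreter = tf.lite.Interpreter(model_path="mnist_model_debug.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one test image and read back both outputs (dense and dense_1)
interpreter.set_tensor(input_details[0]['index'], test_images[0:1].astype(np.float32))
interpreter.invoke()
for detail in output_details:
    print(detail['name'], interpreter.get_tensor(detail['index']).shape)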