This article explains how to deal with the "no SavedModel bundles found!" error when deploying a tensorflow_hub model to AWS SageMaker; the question and recommended answer below should be a useful reference for anyone hitting the same issue.

Problem description

I am attempting to deploy the universal-sentence-encoder model to an AWS SageMaker endpoint and am getting the error ValueError('no SavedModel bundles found!').

I have shown my code below; I have a feeling that one of my paths is incorrect.

import tensorflow as tf
import tensorflow_hub as hub
import numpy as np
from sagemaker import get_execution_role, Session
from sagemaker.tensorflow.serving import Model

def tfhub_to_savedmodel(model_name,uri):
    tfhub_uri = uri
    model_path = 'encoder_model/' + model_name

    with tf.Session(graph=tf.Graph()) as sess:
        module = hub.Module(tfhub_uri)
        input_params = module.get_input_info_dict()
        dtype = input_params['text'].dtype
        shape = input_params['text'].get_shape()

        # define the model inputs
        inputs = {'text': tf.placeholder(dtype, shape, 'text')}

        # define the model outputs
        # for the universal sentence encoder this is the sentence embedding vector
        logits = module(inputs['text'])
        outputs = {
            'vector': logits,
        }

        # export the model
        sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
        tf.saved_model.simple_save(
            sess,
            model_path,
            inputs=inputs,
            outputs=outputs)

    return model_path


sagemaker_role = get_execution_role()

# export the TF Hub module to encoder_model/<model_name>/
# (the module URI below is assumed; the question does not show the actual call)
tfhub_to_savedmodel('universal-sentence-encoder-large',
                    'https://tfhub.dev/google/universal-sentence-encoder-large/3')

!tar -C "$PWD" -czf encoder.tar.gz encoder_model/
model_data = Session().upload_data(path='encoder.tar.gz', key_prefix='model')

env = {'SAGEMAKER_TFS_DEFAULT_MODEL_NAME': 'universal-sentence-encoder-large'}

model = Model(model_data=model_data, role=sagemaker_role, framework_version='1.12', env=env)
predictor = model.deploy(initial_instance_count=1, instance_type='ml.t2.medium')

Recommended answer

I suppose you started from this example? https://github.com/awslabs/amazon-sagemaker-examples/tree/master/sagemaker-python-sdk/tensorflow_serving_container

It looks like you're not saving the TF Serving bundle properly: the model version number directory is missing, because of this line:

model_path = 'encoder_model/' + model_name

Replacing it with this should fix your problem:

model_path = '{}/{}/00000001'.format('encoder_model', model_name)

Your model artefact should look like this (I used the model from the notebook above):

mobilenet/
mobilenet/mobilenet_v2_140_224/
mobilenet/mobilenet_v2_140_224/00000001/
mobilenet/mobilenet_v2_140_224/00000001/saved_model.pb
mobilenet/mobilenet_v2_140_224/00000001/variables/
mobilenet/mobilenet_v2_140_224/00000001/variables/variables.data-00000-of-00001
mobilenet/mobilenet_v2_140_224/00000001/variables/variables.index
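
As a quick sanity check before tarring and uploading, you can walk the export directory and confirm that each model has a numeric version folder containing saved_model.pb. A minimal sketch (assuming the encoder_model/<model_name>/00000001/ layout produced by the corrected code):

import os

def check_savedmodel_layout(export_root):
    # flag any model directory that lacks a numeric version folder
    # with a saved_model.pb inside it
    for model_name in os.listdir(export_root):
        model_dir = os.path.join(export_root, model_name)
        if not os.path.isdir(model_dir):
            continue
        versions = [d for d in os.listdir(model_dir)
                    if d.isdigit()
                    and os.path.isfile(os.path.join(model_dir, d, 'saved_model.pb'))]
        if versions:
            print('{}: found version(s) {}'.format(model_name, sorted(versions)))
        else:
            print('{}: no SavedModel version folder found!'.format(model_name))

check_savedmodel_layout('encoder_model')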

Then, upload to S3 and deploy.
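
Once the endpoint is up, you can send sentences to it through the predictor returned by model.deploy(). A rough usage sketch (the 'instances'/'predictions' keys are the standard TensorFlow Serving REST request/response format; the sentences are just placeholders):

# send a small batch of sentences to the endpoint and read back the embeddings
result = predictor.predict({'instances': ['hello world', 'how are you']})
embeddings = result['predictions']
print(len(embeddings), len(embeddings[0]))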

That concludes this article on the "no SavedModel bundles found!" error when deploying a tensorflow_hub model to AWS SageMaker; hopefully the recommended answer above helps.
