This article explains how to load a TF Hub model from the local system; it should be a useful reference for anyone facing the same problem.
Problem Description
One way is to download the model from tensorflow_hub each time, like the following:
import tensorflow as tf
import tensorflow_hub as hub
# Load the embedding module directly from its tfhub.dev URL.
hub_url = "https://tfhub.dev/google/tf2-preview/nnlm-en-dim128/1"
embed = hub.KerasLayer(hub_url)
embeddings = embed(["A long sentence.", "single-word", "http://example.com"])
print(embeddings.shape, embeddings.dtype)
I want to download the file once and reuse it, without downloading it again each time.
Recommended Answer
- Download your model from the URL with ?tf-hub-format=compressed appended, for example https://tfhub.dev/google/tf2-preview/nnlm-en-dim128/1?tf-hub-format=compressed (a scripted sketch of the first two steps is shown after this list)
- Extract (untar) the downloaded archive
- Load the extracted folder in your code
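A minimal sketch of the first two steps, using only Python's standard urllib.request and tarfile modules; the archive and folder names here are placeholders you can change:
import os
import tarfile
import urllib.request
# Step 1: download the compressed model archive once.
hub_url = "https://tfhub.dev/google/tf2-preview/nnlm-en-dim128/1?tf-hub-format=compressed"
archive_path = "nnlm-en-dim128_1.tar.gz"
model_dir = "nnlm-en-dim128_1"
if not os.path.isdir(model_dir):
    urllib.request.urlretrieve(hub_url, archive_path)
    # Step 2: extract the archive into a local folder that can be reused.
    os.makedirs(model_dir, exist_ok=True)
    with tarfile.open(archive_path, "r:*") as tar:
        tar.extractall(model_dir)
Once extracted, the folder can be passed straight to hub.KerasLayer, as in the code below: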
import tensorflow as tf
import tensorflow_hub as hub
# Point hub.KerasLayer at the locally extracted folder instead of the URL.
embed = hub.KerasLayer('path/to/untarred/folder')
embeddings = embed(["A long sentence.", "single-word", "http://example.com"])
print(embeddings.shape, embeddings.dtype)
That concludes this article on how to load a TF Hub model from the local system; we hope the recommended answer helps.