
How to set up TensorFlow Serving as described on the TF website?

I am trying to learn and set up TF Serving as described here.

I trained my model using the retrain.py file, and all the necessary model files are in the "saved_models" folder.

Now I want to update the server command:

    # Start TensorFlow Serving container and open the REST API port
    docker run -t --rm -p 8501:8501 \
        -v "$TESTDATA/saved_model_half_plus_two_cpu:/models/half_plus_two" \
        -e MODEL_NAME=half_plus_two \
        tensorflow/serving &

I have a hard time understanding how to change the example path to match my current setup. I don't understand how "/models/half_plus_two" got there, since it's not available in the example folders.
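For context, "/models/half_plus_two" exists only *inside* the container: the `-v` flag creates it by mounting a host directory onto that container path, and TensorFlow Serving loads the model from `/models/${MODEL_NAME}` by default. A sketch of how the example command's pieces relate (paths are the example's own):

```shell
# -v "<host_dir>:<container_dir>" mounts a host directory into the container.
#   "$TESTDATA/saved_model_half_plus_two_cpu"  -> host path holding the SavedModel
#   "/models/half_plus_two"                    -> where it appears inside the container
# TensorFlow Serving looks under /models/${MODEL_NAME}, so the container-side
# path must end with the same name passed via -e MODEL_NAME.
docker run -t --rm -p 8501:8501 \
    -v "$TESTDATA/saved_model_half_plus_two_cpu:/models/half_plus_two" \
    -e MODEL_NAME=half_plus_two \
    tensorflow/serving &
```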

This is what I did:

docker run -t --rm -p 8501:8501 -v "c:/tmp/saved_models:/models/1" -e MODEL_NAME=c:\tmp\saved_models\1\saved_model.pb

But it fails:

C:\tmp\serving>docker run -t --rm -p 8501:8501 -v "c:/tmp/saved_models:/models/1" -e MODEL_NAME=c:\tmp\saved_models\1\saved_model.pb
"docker run" requires at least 1 argument.
See 'docker run --help'.

Usage:  docker run [OPTIONS] IMAGE [COMMAND] [ARG...]

Run a command in a new container

Any help is really appreciated.

Did you docker pull the tensorflow/serving image? Also, you need to pass docker run an "image" (the one that you pulled before).

docker pull tensorflow/serving
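The "docker run" requires at least 1 argument error happens because the image name is missing from the end of the command. Also, MODEL_NAME should be the model's serving name, not a file path. A sketch of a corrected sequence for the setup above, assuming c:/tmp/saved_models contains the version folder 1/ with saved_model.pb in it ("my_model" is a hypothetical serving name, not from the original post):

```shell
# One-time: fetch the serving image
docker pull tensorflow/serving

# Mount the host folder onto /models/<name>, pass the same <name> via
# MODEL_NAME, and end the command with the image to run. TF Serving
# picks up the numeric version directory (1/) automatically.
docker run -t --rm -p 8501:8501 \
    -v "c:/tmp/saved_models:/models/my_model" \
    -e MODEL_NAME=my_model \
    tensorflow/serving
```

Note that in Windows cmd the trailing backslash line continuations don't work; either replace them with ^ or put the whole command on one line.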

