
Building tensorflow-serving docker gpu from source with docker?

I have followed these steps to build the TensorFlow Serving GPU Docker image from source:

git clone https://github.com/tensorflow/serving
cd serving

docker build --pull -t $USER/tensorflow-serving-devel-gpu \
  -f tensorflow_serving/tools/docker/Dockerfile.devel-gpu .

docker build -t $USER/tensorflow-serving-gpu \
  --build-arg TF_SERVING_BUILD_IMAGE=$USER/tensorflow-serving-devel-gpu \
  -f tensorflow_serving/tools/docker/Dockerfile.gpu .

These took quite a long time to compile, and both builds completed successfully.

Now if I check docker images, I see the response below:

REPOSITORY                          TAG                             IMAGE ID       CREATED             SIZE
root/tensorflow-serving-gpu         latest                          42e221bb6bc9   About an hour ago   8.49GB
root/tensorflow-serving-devel-gpu   latest                          7fd974e5e0c5   2 hours ago         21.8GB
nvidia/cuda                         11.0-cudnn8-devel-ubuntu18.04   7c49b261611b   3 months ago        7.41GB

I have two questions regarding this:

  1. Building from source took a large amount of time, and now I want to back up / save these images or containers so I can re-use them later on a different machine with the same architecture. If you know how to do it, please help me with the commands.

  2. Since the build completed successfully, I want to free up some space by removing the unnecessary development images that were only used to build tensorflow-serving-gpu. I have three images here related to TensorFlow Serving, and I don't know which ones are safe to delete.

If you want to save an image:

docker save root/tensorflow-serving-gpu:latest  -o  tfs.tar

And if you want to load it:

docker load -i tfs.tar
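
To re-use the image on another machine with the same architecture (your first question), one common approach is to copy the archive over and load it there. A minimal sketch, where the hostname and paths are placeholders:

scp tfs.tar user@other-machine:/tmp/tfs.tar   # copy the saved archive to the target machine
docker load -i /tmp/tfs.tar                   # run this on the target machine
docker images                                 # the image should now be listed there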

root/tensorflow-serving-gpu and root/tensorflow-serving-devel-gpu are two different images. You can see the differences by looking at the details of Dockerfile.devel-gpu and Dockerfile.gpu.
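
Regarding your second question: only root/tensorflow-serving-gpu is needed to actually serve models; root/tensorflow-serving-devel-gpu and the nvidia/cuda devel base are only needed if you want to rebuild from source. Assuming you don't plan to rebuild, a sketch for reclaiming the space could be:

docker rmi root/tensorflow-serving-devel-gpu:latest    # ~21.8GB build-time image
docker rmi nvidia/cuda:11.0-cudnn8-devel-ubuntu18.04   # base image used only by the devel build
docker image prune                                     # remove any dangling layers left by the build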
