
How to build models.config for TensorFlow Serving

I want to host multiple models on TensorFlow Serving. The TensorFlow Serving documentation states: "Create a file containing an ASCII ModelServerConfig protocol buffer." But I am not able to build this models.config. Please help me with this.

As mentioned in the documentation you are referring to, you need to create a plain-text file in the Google Protocol Buffers text format. Then you save this file and pass the model_config_file flag to the model server executable, giving it the path to the file you have just created. If you are running Docker, there is a section in the GitHub README dedicated to that.
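For reference, a models.config for multiple models is a ModelServerConfig message written in protocol buffer text format. Below is a minimal sketch; the model names (model_a, model_b) and base paths are placeholders you would replace with your own:

    # models.config -- one config block per model to serve
    model_config_list {
      config {
        name: "model_a"
        base_path: "/models/model_a"
        model_platform: "tensorflow"
      }
      config {
        name: "model_b"
        base_path: "/models/model_b"
        model_platform: "tensorflow"
      }
    }

With Docker, you then mount the model directories and the config file into the container and point the server at the config via the model_config_file flag. The host paths below are assumptions for illustration:

    docker run -p 8501:8501 \
      --mount type=bind,source=/path/to/models,target=/models \
      --mount type=bind,source=/path/to/models.config,target=/models/models.config \
      -t tensorflow/serving \
      --model_config_file=/models/models.config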
