
Compiling protobuf messages using python plugin within Google Cloud Build

I've been dealing with this problem for several weeks now and am in dire need of help. So thanks in advance for any insight you might have into how to compile protobufs into pb2.py files such that they are accessible to the rest of your workspace during a Google Cloud Build.

Attempts so far:

  1. I first tried to use the google-cloud-builders protoc image. I was able to successfully push the image to the builders project registry, but I'm not sure whether I correctly installed the python plugin.

    Here is my cloud_build.yaml step:

     - name: gcr.io/eco-env-238021/protoc
       args:
         - --proto-path=./protos
         - --python_out=./protos
         - ./protos/A.proto

    I kept getting an error reading: failed: starting step container failed: Error response from daemon: OCI runtime create failed: container_linux.go:380: starting container process caused: exec: "protoc": executable file not found in $PATH: unknown

  2. Next I tried using the pip-installable grpcio-tools python package to compile the protos. This was much more successful because I was actually able to generate the pb2.py files. My excitement was short-lived, though. During the cloud build, I call a test file which imports one of the pb2.py files, which we'll call A. Now I'm getting a module-not-found error when A imports another pb2.py file, which I'll refer to as B. I've printed out the directory structure within the cloud build environment: both A and B exist, and B is definitely accessible to A (they exist within the same package).
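
As a side note on the grpcio-tools route: the compiler can also be invoked directly from Python, so the two compile steps shown further down could be collapsed into a single script if that ever helps. This is only a sketch of that mechanism (it assumes A.proto and B.proto don't need the bundled well-known-type includes, which python -m grpc_tools.protoc adds automatically but a direct call does not):

    # compile_protos.py -- sketch of calling the grpcio-tools compiler from
    # Python instead of via "python -m grpc_tools.protoc"; equivalent to the
    # two compile steps in the cloudbuild config below.
    from grpc_tools import protoc

    for proto in ("./protos/A.proto", "./protos/B.proto"):
        exit_code = protoc.main([
            "grpc_tools.protoc",      # placeholder for argv[0], the program name
            "-I", "./protos",
            "--python_out=./protos",
            proto,
        ])
        if exit_code != 0:
            raise RuntimeError(f"protoc failed for {proto} with exit code {exit_code}")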

Here is the directory structure:

 C:.
  |   cloudbuild.yaml
  |   __init__.py
  |       
  +---protos
  |       A_pb2.py
  |       B_pb2.py
  |       __init__.py
  |       
  +---tests
  |       test.py

Here are my cloud_build.yaml steps:
    - name: python:3.7
      args: ["python", "-m", "grpc_tools.protoc",  "-I", "./protos",
             "--python_out=./protos", "./protos/A.proto"]
    - name: python:3.7
      args: ["python", "-m", "grpc_tools.protoc",  "-I", "./protos",
             "--python_out=./protos", "./protos/B.proto"]

    - name: python:3.7
      args: ["python","-m","unittest","discover","--verbose","-s","./tests/",
            "-p","test.py"]
      id: unittest

The module import error is likely completely unrelated to the compilation of the protobuf files and simply a property of the cloud build environment. Something I did notice, though, is that if I precompile the protobuf files within the repository that triggers the cloud build, everything works correctly. The same process on my PC also works. I just don't understand how the files can exist but not be importable.
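
One common cause of exactly this symptom (an assumption, since the .proto sources aren't shown above) is that protoc emits absolute imports: if A.proto declares import "B.proto"; and both files are compiled with -I ./protos, the generated A_pb2.py contains "import B_pb2 as B__pb2" rather than a relative import. Running python -m unittest from the repository root puts the root on sys.path, so "from protos import A_pb2" resolves, but the protos directory itself is not on the path, so the inner "import B_pb2" fails even though B_pb2.py sits right next to A_pb2.py. A minimal sketch of a path workaround inside the test file, using the layout from the directory tree above:

    # tests/test.py -- sketch only; assumes A_pb2.py contains an absolute
    # "import B_pb2" line, which is what protoc generates for
    # -I ./protos --python_out=./protos when A.proto imports B.proto.
    import os
    import sys
    import unittest

    REPO_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
    sys.path.insert(0, REPO_ROOT)                          # lets "from protos import A_pb2" resolve
    sys.path.insert(0, os.path.join(REPO_ROOT, "protos"))  # lets A_pb2's "import B_pb2" resolve

    from protos import A_pb2  # noqa: E402


    class TestGeneratedImports(unittest.TestCase):
        def test_generated_module_loads(self):
            # Without the sys.path tweaks above, importing A_pb2 fails with
            # ModuleNotFoundError: No module named 'B_pb2'.
            self.assertTrue(hasattr(A_pb2, "DESCRIPTOR"))


    if __name__ == "__main__":
        unittest.main()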

Update: I solved this by making my directory of protobufs (i.e. protos in this example) a pip-installable package, because pip install seems to correctly add packages to the Python import path in the cloud build environment.
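
For anyone following the same route, here is a minimal sketch of what "making the protos directory a pip installable package" can look like, assuming a setup.py at the repository root; the name and version are placeholders, not the values actually used:

    # setup.py at the repository root (sketch; name and version are
    # placeholders).  packages=["protos"] installs the directory holding the
    # generated *_pb2.py files along with its __init__.py.
    from setuptools import setup

    setup(
        name="protos",
        version="0.1.0",
        packages=["protos"],
    )

Running the tests after a pip install of this package then resolves the generated modules through site-packages instead of relying on the working directory of the build step.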
