
Which GCP service(s) would be best for my use case?

I have an image in Artifact Registry that does a unit of work:

  • it expects input files inside a certain directory (let's call it main_input)

  • processes them, performs a sequence of computations, and writes the results to an output folder in Google Cloud Storage

The run time of each does not exceed 30 minutes, but I have thousands of such runs to perform.

Inside a single VM, I can create multiple containers from this image by mounting each container's main_input directory to the corresponding directory on the host, and running them.
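
For context, here is a minimal sketch of this single-VM approach using the Docker SDK for Python (docker-py). The image path and host directories are hypothetical placeholders, and it assumes the container reads everything it needs from /main_input:

    import docker

    client = docker.from_env()

    # one host directory per unit of work (placeholder paths)
    host_dirs = ["/data/run-0001", "/data/run-0002"]

    for host_dir in host_dirs:
        client.containers.run(
            "europe-docker.pkg.dev/my-project/my-repo/worker:latest",  # placeholder image
            volumes={host_dir: {"bind": "/main_input", "mode": "ro"}},
            detach=True,  # start the container and move on to the next one
        )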

However, I wonder whether Cloud Run would be a more scalable solution for this, or should I look at other services/strategies?

Managing thousands of runs is not an easy task; you can use a scheduler such as Airflow or Argo Workflows to run the tasks and restart them if needed.
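
For example, here is a minimal Airflow DAG sketch that fans out one task per run via KubernetesPodOperator and retries failed runs automatically. It assumes a recent Airflow 2.x with the cncf.kubernetes provider installed (the exact import path depends on the provider version), and the image path, bucket, and run IDs are placeholders:

    from datetime import datetime

    from airflow import DAG
    # on older provider versions the module is ...operators.kubernetes_pod
    from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

    RUN_IDS = ["0001", "0002"]  # in practice, thousands of entries

    with DAG(
        dag_id="batch_runs",
        start_date=datetime(2024, 1, 1),
        schedule=None,   # triggered manually or by an upstream event
        catchup=False,
    ) as dag:
        for run_id in RUN_IDS:
            KubernetesPodOperator(
                task_id=f"process_{run_id}",
                name=f"process-{run_id}",
                namespace="default",
                image="europe-docker.pkg.dev/my-project/my-repo/worker:latest",  # placeholder
                arguments=["--input", f"gs://my-bucket/main_input/{run_id}"],    # placeholder
                retries=2,       # restart the task if it fails
                get_logs=True,
            )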

For the container environment, I would propose Kubernetes (GKE) over Cloud Run, for several reasons:

  • you have more control and permissions than with Cloud Run
  • you stay agnostic of GCP (your code can run on other platforms such as AWS)
  • better management of application configuration and secrets, with support from all the common CI/CD tools
  • less expensive: you can create scalable preemptible node pools to reduce cost and add a lot of resources when needed (see the sketch after this list)
  • you can use the same cluster to run other applications for your company
  • you can use open-source tools for log collection, monitoring, HTTP reverse proxies, and more
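
As a concrete illustration of the preemptible node pool point above, here is a minimal sketch that submits one Kubernetes Job per unit of work with the official Python client. It assumes a GKE cluster whose preemptible node pool carries the cloud.google.com/gke-preemptible=true label; the image path, bucket, and run IDs are placeholders:

    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config() when running inside the cluster
    batch = client.BatchV1Api()

    def make_job(run_id: str) -> client.V1Job:
        container = client.V1Container(
            name="worker",
            image="europe-docker.pkg.dev/my-project/my-repo/worker:latest",  # placeholder
            args=["--input", f"gs://my-bucket/main_input/{run_id}"],          # placeholder
        )
        pod_spec = client.V1PodSpec(
            restart_policy="Never",
            containers=[container],
            node_selector={"cloud.google.com/gke-preemptible": "true"},  # target the cheap node pool
        )
        return client.V1Job(
            metadata=client.V1ObjectMeta(name=f"run-{run_id}"),
            spec=client.V1JobSpec(
                backoff_limit=3,  # retry the pod, e.g. if a preemptible node is reclaimed
                template=client.V1PodTemplateSpec(spec=pod_spec),
            ),
        )

    for run_id in ["0001", "0002"]:  # in practice, thousands of IDs
        batch.create_namespaced_job(namespace="default", body=make_job(run_id))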
