Which GCP service(s) is best for my use case?
I have an image in Artifact Registry that performs a unit of work:
it expects input files inside a certain directory (let's call it main_input)
it runs a sequence of computations on them and writes the results to an output folder in Google Cloud Storage
The run time of each job does not exceed 30 minutes, but I have thousands of such runs to perform.
On a single VM, I can create multiple containers from this image by mounting the container's main_input directory to the right directories on the host, and run them.
However, I wonder whether Cloud Run is a more scalable solution for this, or should I look at other services/strategies?
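Since each run is a finite batch task rather than a request-serving workload, Cloud Run Jobs (not a Cloud Run service) is the variant to compare against. A rough sketch of the fan-out, assuming a hypothetical image path `europe-west1-docker.pkg.dev/my-project/my-repo/worker` and a region of `europe-west1`:

```shell
# Create a job from the Artifact Registry image. --tasks fans out
# parallel executions; each task can find its index in the
# CLOUD_RUN_TASK_INDEX environment variable.
gcloud run jobs create worker-job \
  --image=europe-west1-docker.pkg.dev/my-project/my-repo/worker \
  --tasks=1000 \
  --parallelism=50 \
  --max-retries=2 \
  --task-timeout=35m \
  --region=europe-west1

# Kick off one execution covering all tasks.
gcloud run jobs execute worker-job --region=europe-west1
```

One caveat: Cloud Run containers cannot mount host directories, so each task would typically fetch its slice of inputs from Cloud Storage (e.g., keyed by CLOUD_RUN_TASK_INDEX) instead of reading a bind-mounted main_input.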
Managing thousands of runs is not an easy task; you can use a scheduler such as Airflow or Argo Workflows to run the tasks and restart them if needed.
For the container environment, I would propose Kubernetes (GKE) over Cloud Run, for a few reasons:
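On GKE, each run maps naturally onto a Kubernetes Job, which a scheduler like Argo Workflows can create and retry for you. A minimal sketch, assuming a cluster already exists and reusing the hypothetical image path from the question:

```shell
# Launch one unit of work as a Kubernetes Job; failed pods are
# retried automatically up to the job's backoff limit.
kubectl create job worker-run-001 \
  --image=europe-west1-docker.pkg.dev/my-project/my-repo/worker

# Watch until the job reports completion.
kubectl get job worker-run-001 --watch
```

For thousands of runs you would not create jobs by hand like this; the scheduler generates them from a template, which is exactly what Argo Workflows does on top of Kubernetes.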