I have an image in Artifact Registry that does a unit of work:
it expects input files inside a certain directory (let's call it main_input), runs a sequence of computations on them, and writes the results to an output folder in Google Cloud Storage.
The run time of each does not exceed 30 minutes, but I have thousands of such runs to perform.
On a single VM, I can create multiple containers from this image by bind-mounting the appropriate host directory onto main_input inside each container, and then run them.
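To make the setup concrete, here is a minimal sketch of how the per-container commands could be generated on the host. The image path and the host input layout (`/data/inputs/run-*`) are hypothetical placeholders, not taken from my actual project:

```python
import shlex
from pathlib import Path

# Hypothetical Artifact Registry image path.
IMAGE = "us-central1-docker.pkg.dev/my-project/my-repo/worker:latest"

def docker_run_command(input_dir: Path) -> str:
    """Build the `docker run` invocation for one unit of work,
    bind-mounting the host input directory onto /main_input (read-only)."""
    return (
        "docker run --rm "
        f"-v {shlex.quote(str(input_dir))}:/main_input:ro "
        f"{IMAGE}"
    )

# One command per input directory found on the host (hypothetical layout).
commands = [docker_run_command(d) for d in sorted(Path("/data/inputs").glob("run-*"))]
```

Each command is then executed on the VM, with the container writing its results to Cloud Storage before exiting.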
However, I wonder whether Cloud Run would be a more scalable solution for this, or should I look at other services/strategies?
Managing thousands of runs is not an easy task; you can use a scheduler such as Airflow or Argo Workflows to launch the tasks and restart them if they fail.
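As an illustration of the Argo Workflows approach, a workflow can fan out over the runs and retry failures automatically. This is only a sketch; the image path, run count, and how each pod locates its inputs are assumptions you would adapt to your setup:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: batch-runs-
spec:
  entrypoint: fan-out
  templates:
    - name: fan-out
      steps:
        - - name: run-one
            template: worker
            arguments:
              parameters:
                - name: index
                  value: "{{item}}"
            withSequence:
              count: "1000"        # one step per run (adjust to your total)
    - name: worker
      retryStrategy:
        limit: "3"                 # restart a failed run up to 3 times
      inputs:
        parameters:
          - name: index
      container:
        # Hypothetical image path; each pod would fetch or mount the
        # inputs for run {{inputs.parameters.index}} into /main_input.
        image: us-central1-docker.pkg.dev/my-project/my-repo/worker:latest
```

Argo records the status of every step, so re-running only the failed subset of thousands of runs becomes straightforward.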
For the container environment, I would propose Kubernetes (GKE) over Cloud Run, for a few reasons: