
Drain GCP dataflow jobs with terraform

When deploying a dataflow job with terraform it does not drain the old job. Is there a possibility to automatically drain it and then deploy?

I think it's not the responsibility of Terraform to deploy custom Dataflow jobs (as opposed to Dataflow templates).

It's more the responsibility of a deployment task in a CI/CD pipeline.

In this case there is no built-in way to automatically drain the old job and deploy a new version.

You have to develop your own script (a shell script, for example) to do that and apply your own strategy.

For example, a Dataflow job can be drained with a gcloud command:

gcloud dataflow jobs drain JOB_ID
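
As a rough illustration, a minimal sketch of such a script could look like the one below. The job name, region, and the final launch step are placeholders you would replace with your own values; the filter and format keys may also need adjusting to your gcloud version.

#!/usr/bin/env bash
set -euo pipefail

# Hypothetical values -- replace with your own job name and region.
JOB_NAME="my-dataflow-job"
REGION="europe-west1"

# Look up the ID of the currently running job with that name (if any).
JOB_ID=$(gcloud dataflow jobs list \
  --region="$REGION" \
  --status=active \
  --filter="name=$JOB_NAME" \
  --format="value(id)")

if [[ -n "$JOB_ID" ]]; then
  # Ask Dataflow to drain the old job: it stops reading new input
  # and finishes processing the data already in flight.
  gcloud dataflow jobs drain "$JOB_ID" --region="$REGION"

  # Wait until no active job with that name remains before deploying the new one.
  while gcloud dataflow jobs list \
      --region="$REGION" \
      --status=active \
      --filter="name=$JOB_NAME" \
      --format="value(id)" | grep -q .; do
    sleep 30
  done
fi

# Launch the new version of the job here with your usual submission command
# (placeholder -- e.g. your pipeline's own run/exec step).

You would then call this script from your CI/CD pipeline instead of relying on Terraform to replace the job.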
