Loading CSV data to BigQuery using Python in Terraform
Read a CSV file and load it into BigQuery through a Dataflow job, using Python code for this instead of templates. How can I perform this task using Terraform (GCP)? Can anyone help?

I am trying to do it but I don't understand what Terraform script I should write for it.
It's not the responsibility of Terraform to deploy a Dataflow job. There is only a Terraform resource to instantiate a Dataflow template.

You can delegate this to your CI/CD.
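For completeness, the Terraform resource mentioned above only launches an existing Dataflow template. A minimal sketch, assuming a classic template already stored in Cloud Storage (the job name, bucket, template path, and parameters below are placeholders, not from the question):

```hcl
# Hypothetical example: all names, paths, and parameters are placeholders.
# This runs a pre-built template; it cannot run arbitrary Python Beam code.
resource "google_dataflow_job" "csv_to_bq" {
  name              = "csv-to-bq"
  template_gcs_path = "gs://my-bucket/templates/my_template"
  temp_gcs_location = "gs://my-bucket/tmp"

  parameters = {
    input  = "gs://my-bucket/input/data.csv"
    output = "my_project:my_dataset.my_table"
  }
}
```

This is why a custom Python Beam job has to be launched from CI/CD (or Airflow) rather than from Terraform itself.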
Example with Beam Python:

- Develop the job in Python Beam
- Deploy the Python Beam code to a Cloud Storage bucket
- Run the Dataflow job and main file with the Python command line

Example with Beam Java and mvn compile:

- Develop the job in Java Beam with Maven or Gradle
- Use the mvn compile command to execute the Dataflow job

Example with Beam Java and a fat jar:

- Develop the job in Java Beam with Maven or Gradle
- Build a fat jar
- Deploy the fat jar to a Cloud Storage bucket
- Run the Dataflow job and the Main inside the fat jar with the java -jar command
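For the Beam Python route above, the core of the job is a transform that turns each CSV line into a BigQuery row dict. A minimal sketch, assuming a CSV with `name,age` columns (the file paths, table, and schema are illustrative placeholders, not from the original question):

```python
import csv
import io

# Parse one CSV line into a dict matching the BigQuery schema.
# Using the csv module (rather than line.split(",")) handles quoted fields.
def parse_csv_line(line):
    name, age = next(csv.reader(io.StringIO(line)))
    return {"name": name, "age": int(age)}

# In the actual Beam job (requires apache-beam[gcp]), this function is
# applied with beam.Map and the result written with WriteToBigQuery:
#
#   import apache_beam as beam
#   with beam.Pipeline(options=pipeline_options) as p:
#       (p
#        | beam.io.ReadFromText("gs://my-bucket/input/data.csv",
#                               skip_header_lines=1)
#        | beam.Map(parse_csv_line)
#        | beam.io.WriteToBigQuery(
#              "my_project:my_dataset.my_table",
#              schema="name:STRING,age:INTEGER"))

print(parse_csv_line('"Doe, John",42'))
```

The CI/CD then launches the main file with the standard Beam pipeline flags, along the lines of `python main.py --runner DataflowRunner --project my_project --region europe-west1 --temp_location gs://my-bucket/tmp` (exact options depend on your setup).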
Example with Beam Python and Airflow / Cloud Composer:

- Develop the job in Python Beam
- Deploy the Python Beam code to the Cloud Composer bucket with gcloud composer
- In the Airflow code, use BeamRunPythonPipelineOperator to instantiate the Dataflow job
- Run the Airflow DAG to run the Dataflow job

Example with Beam Java and Airflow / Cloud Composer:

- Develop the job in Java Beam
- Build a fat jar
- Deploy the fat jar to a Cloud Storage bucket
- In the Airflow code, use BeamRunJavaPipelineOperator to instantiate the Dataflow job, targeting the path of the fat jar
- Run the Airflow DAG to run the Dataflow job
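The Airflow / Cloud Composer route above can be sketched as a DAG definition. This is a sketch only, assuming a Composer environment with the apache-airflow-providers-apache-beam package installed; the DAG id, bucket, project, and region are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.beam.operators.beam import (
    BeamRunPythonPipelineOperator,
)

with DAG(
    dag_id="csv_to_bigquery",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,  # triggered manually
    catchup=False,
) as dag:
    # Points at the Beam main file uploaded to the Composer bucket.
    run_dataflow = BeamRunPythonPipelineOperator(
        task_id="run_dataflow",
        runner="DataflowRunner",
        py_file="gs://my-composer-bucket/dags/beam/main.py",
        pipeline_options={
            "project": "my_project",
            "region": "europe-west1",
            "temp_location": "gs://my-bucket/tmp",
        },
    )
```

The Java variant is the same shape with BeamRunJavaPipelineOperator and a `jar=` argument pointing at the fat jar in Cloud Storage.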