Cloud Dataflow job reading from one BigQuery project and writing to another BigQuery project
I'm implementing a Cloud Dataflow job on GCP that needs to deal with two GCP projects. Both the input and the output are partitioned BigQuery tables. The issue I'm running into is that I must read data from a project A and write it into a project B.

I haven't seen anything related to cross-project service accounts, and I can't give Dataflow two different credential keys either, which is a bit annoying. Has anyone else worked with this kind of architecture, and if so, how did you deal with it?
I think you can accomplish this with a single service account; it is very simple: grant that service account the required BigQuery permissions/access in both projects.

So you only need one service account that has the required access/permissions in both projects.

Hope it helps.