What is the best way to read a US-region GCS bucket and write the data into a Europe-region bucket?
The need is to read from a GCS bucket (bucket-1) in the US region and then write into a Europe-region GCS bucket (bucket-2). I would like to find the best solution for this requirement. I thought about making bucket-1 multi-regional, but I could see the location options below and am not sure how it works if I choose one of them.
Can anyone please suggest a solution? Note: the requirement is that bucket-1 should always be in the US region.
You can't have a multi- or dual-region bucket that spans continents, so you must have two different buckets, not just one.
Then you have to implement a solution to replicate the data. Storage Transfer Service offers a convenient way to achieve that, but you don't have much control over the trigger.
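A scheduled replication job can be created with the Storage Transfer Service client library. This is only a sketch under assumptions: the bucket names, project ID, and start date are placeholders, and the Storage Transfer service account must have read access on the source bucket and write access on the sink bucket.

```python
def build_transfer_job(project_id, src_bucket, dst_bucket, start):
    """Build a daily Storage Transfer Service job request (plain dict form)."""
    return {
        "transfer_job": {
            "project_id": project_id,
            "status": "ENABLED",
            "schedule": {
                # Repeats daily from this start date (no end date given).
                "schedule_start_date": {
                    "year": start[0], "month": start[1], "day": start[2],
                },
            },
            "transfer_spec": {
                "gcs_data_source": {"bucket_name": src_bucket},
                "gcs_data_sink": {"bucket_name": dst_bucket},
            },
        }
    }

def create_job(project_id):
    """Submit the job. Requires the google-cloud-storage-transfer package
    and application default credentials."""
    from google.cloud import storage_transfer  # lazy import: cloud dependency
    client = storage_transfer.StorageTransferServiceClient()
    request = build_transfer_job(project_id, "bucket-1", "bucket-2", (2023, 1, 1))
    return client.create_transfer_job(request)
```

This gives you a recurring daily copy; as noted above, you cannot trigger it per-object, only on the job's schedule.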
You can also implement the copy yourself, based on a GCS event and a function that simply calls a copy API. It's not really difficult to implement, but it's not out-of-the-box.
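A minimal sketch of that event-driven approach, assuming a Cloud Function (Python runtime) triggered on `google.storage.object.finalize` events for the source bucket; the `bucket-2` name is a placeholder, and the function's service account needs read access on the source and write access on the destination.

```python
DEST_BUCKET = "bucket-2"  # assumed name of the Europe-region sink bucket

def object_name(event):
    """Extract the object name from a GCS finalize event payload."""
    return event["name"]

def copy_object(event, context):
    """Background Cloud Function: copy the newly finalized object
    from the US bucket into the Europe bucket."""
    # Lazy import so the module loads without google-cloud-storage installed.
    from google.cloud import storage
    name = object_name(event)
    client = storage.Client()
    src_bucket = client.bucket(event["bucket"])
    dst_bucket = client.bucket(DEST_BUCKET)
    # copy_blob performs a server-side copy; the object's bytes do not
    # pass through the function instance.
    src_bucket.copy_blob(src_bucket.blob(name), dst_bucket, name)
```

Deployed with a finalize trigger on the source bucket, this copies every new object as it lands. For very large objects a single copy call can run long, so keep an eye on the function's timeout.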
This is how I made it work: I created bucket-1 as US multi-regional and bucket-2 as regional in europe-west3. That's all; my Cloud Function is able to read from bucket-1 and write into bucket-2 successfully.