
What is the best way to read a US-region GCS bucket and write the data into a Europe-region bucket?

The need is to read a GCS bucket (bucket-1) located in the US region and then write into a Europe-region GCS bucket (bucket-2). I would like to find the best solution for this requirement. I thought of making bucket-1 multi-regional, but I can see the options below and I am not sure how each one works if I choose it.

[screenshot: bucket location options]

Can anyone please suggest a solution? Note: the requirement is that bucket-1 must always be in the US region.

You can't have a multi- or dual-region bucket spanning continents, so you must have two different buckets, not just one.

Then you have to implement a solution to replicate the data. Storage Transfer Service offers a convenient way to achieve that, but you don't have much control over the trigger.
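As a minimal sketch, a scheduled Storage Transfer Service job between the two buckets can be created from the CLI. The bucket names and job name here are placeholders, and flag availability may vary by gcloud version, so check `gcloud transfer jobs create --help`:

```shell
# Create a daily transfer job that replicates bucket-1 (US) into
# bucket-2 (Europe). Requires the Storage Transfer API to be enabled
# and the service agent to have read/write access on both buckets.
gcloud transfer jobs create \
  gs://bucket-1-us gs://bucket-2-eu \
  --schedule-repeats-every=1d
```

The trade-off mentioned above is the trigger: the job runs on its schedule, not immediately when a new object lands in the source bucket.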

You can also implement the copy yourself, based on a GCS event and a function that simply calls a copy API. It is not really difficult to implement, but it is not out of the box.
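A sketch of that event-driven copy, as a Cloud Function triggered by `google.storage.object.finalize` events on bucket-1. The bucket name `bucket-2-eu` and the `destination_name` helper are assumptions for illustration, not names from the question:

```python
# Cloud Function sketch: copy each newly finalized object from the
# US source bucket into a Europe-region destination bucket.

DEST_BUCKET = "bucket-2-eu"  # hypothetical Europe-region bucket name


def destination_name(object_name: str) -> str:
    """Destination object path; here we simply keep the same name."""
    return object_name


def replicate_object(event: dict, context=None) -> None:
    """Entry point for a google.storage.object.finalize trigger."""
    # Imported lazily so the pure helper above stays unit-testable
    # without the google-cloud-storage package installed.
    from google.cloud import storage

    client = storage.Client()
    src_bucket = client.bucket(event["bucket"])
    dst_bucket = client.bucket(DEST_BUCKET)
    blob = src_bucket.blob(event["name"])
    # copy_blob performs the copy server-side; the data is rewritten in
    # the destination location, so cross-continent charges still apply.
    src_bucket.copy_blob(blob, dst_bucket, destination_name(event["name"]))
```

Unlike a scheduled transfer job, this replicates each object as soon as its finalize event fires.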

This is how I made it work: I created bucket-1 as US multi-regional and bucket-2 as regional in europe-west3. That's all; my Cloud Function is able to read from bucket-1 and write into bucket-2 successfully.

