
Copy data backups (AZ File storage) to GCP Storage

What is the best solution to transfer a huge amount of data (a few TB) from Azure File Storage to GCP Storage? It's not Azure Blob Storage, so I can't use AzCopy or the Data Transfer option on the GCP side. Is there any other 'easy' way to transfer that kind of data?

AFAIK, unfortunately there is no service in Azure that copies data from File Storage to GCP Storage directly. Even in Azure Data Factory, which is widely used for copying data, GCP Storage can only be used as a source, not as a sink; i.e., you can only copy data from GCP Storage to Azure Storage. Please refer to the documentation for more details on ADF-supported data stores.

So, the alternative is to copy the data to Blob Storage first, and then copy it to GCP Storage using Google Cloud's Storage Transfer Service.
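As a rough sketch, the two-step copy could look like the following. This assumes AzCopy v10 and the gcloud CLI are installed; the account name, share/container names, bucket name, SAS tokens, and credentials file are all placeholders, not values from the question.

```shell
# Step 1: server-side copy from the Azure file share into a staging
# Blob container (AzCopy v10 supports File-to-Blob copies).
azcopy copy \
  "https://myaccount.file.core.windows.net/myshare?<file-SAS>" \
  "https://myaccount.blob.core.windows.net/staging?<blob-SAS>" \
  --recursive

# Step 2: create a Storage Transfer Service job from the staging
# container to a GCS bucket; azure-creds.json holds the SAS token.
gcloud transfer jobs create \
  https://myaccount.blob.core.windows.net/staging gs://my-gcp-bucket \
  --source-creds-file=azure-creds.json
```

The second step can also be configured from the Cloud Console UI if you prefer not to use the CLI.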


There's a Docker image of an agent: https://hub.docker.com/repository/docker/securitasmachina2022/securitasmachinaoffsiteagent

Do note there's a 5 TiB per-object size limit in Google Cloud Storage buckets, so any single backup file larger than that would need to be split before the transfer.
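A quick way to pre-check for that limit before starting the transfer is to compare object sizes against 5 TiB. A minimal sketch (the `oversized` helper and the sample file names are illustrative, not from the answer):

```python
# 5 TiB in bytes: the per-object size limit in Google Cloud Storage.
GCS_OBJECT_LIMIT = 5 * 1024**4

def oversized(sizes):
    """Return the names whose size exceeds the GCS per-object limit.

    `sizes` maps a file name to its size in bytes (e.g. collected by
    listing the Azure file share before the copy).
    """
    return [name for name, size in sizes.items() if size > GCS_OBJECT_LIMIT]

sizes = {
    "backup-small.vhd": 2 * 1024**4,  # 2 TiB -> fine
    "backup-huge.vhd": 6 * 1024**4,   # 6 TiB -> must be split first
}
print(oversized(sizes))  # → ['backup-huge.vhd']
```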
