
How to transfer data from Google Cloud Storage to S3 without any manual activity?

I want to transfer data from my Cloud Storage bucket to S3. I did a lot of research and found an interesting article showing that this can be done with the gsutil command, but that again requires a manual step. Is there any way to do this without any manual activity, for example by running the gsutil command through a Java API?

I have created a VM instance in my Google Cloud project and configured the boto file with my AWS credentials. Using the gsutil command I was able to copy my data from Cloud Storage to S3. I want to automate this step.

gsutil can work with both Google Storage and S3:

gsutil rsync -d -r gs://my-gs-bucket s3://my-s3-bucket

You just need to configure it with both Google and AWS S3 credentials. gsutil reads the credentials from the ~/.boto file.
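A minimal ~/.boto sketch for the AWS side might look like this (the section and key names are standard boto/gsutil settings, but the values and the region endpoint are placeholders; Google credentials typically come from `gcloud auth` rather than this file):

```ini
[Credentials]
; AWS credentials used for s3:// URLs (placeholder values)
aws_access_key_id = YOUR_AWS_ACCESS_KEY_ID
aws_secret_access_key = YOUR_AWS_SECRET_ACCESS_KEY

[s3]
; Needed for regions/buckets that only accept Signature Version 4 requests
use-sigv4 = True
host = s3.us-east-1.amazonaws.com
```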

There are multiple ways to automate shell command execution, from Java's Runtime.exec() to Google Cloud Scheduler (your cloud's cron).
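Since the question asks about a Java API, here is one way this could look with ProcessBuilder (the modern replacement for Runtime.exec()). This is only a sketch: the bucket names are placeholders, and the real gsutil invocation is shown in a comment while a harmless `echo` stand-in keeps the example runnable on machines without gsutil installed.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.List;

public class GsutilSync {

    // Runs a command, streams its combined stdout/stderr, and returns the exit code.
    static int run(List<String> command) throws Exception {
        Process p = new ProcessBuilder(command)
                .redirectErrorStream(true)
                .start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }
        return p.waitFor();
    }

    public static void main(String[] args) throws Exception {
        // Real usage (requires gsutil on PATH and a configured ~/.boto):
        // run(List.of("gsutil", "rsync", "-d", "-r", "gs://my-gs-bucket", "s3://my-s3-bucket"));

        // Stand-in command so this sketch runs anywhere:
        int exit = run(List.of("echo",
                "would run: gsutil rsync -d -r gs://my-gs-bucket s3://my-s3-bucket"));
        System.out.println("exit code: " + exit);
    }
}
```

A non-zero exit code from gsutil would indicate a failed sync, so in a real scheduler job you would check the return value of `run` and alert on failure.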

A fully automated and easy-to-use way would be Google Storage Transfer. It lets you select the source and target buckets in either S3 or GCS and set a schedule such as "run daily at 1am".
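Assuming the gcloud CLI is installed and the Storage Transfer API is enabled for the project, a daily job can be sketched roughly like this (bucket names are placeholders, AWS credentials must be granted to the Storage Transfer Service separately, and the exact flags may vary by gcloud version, so check `gcloud transfer jobs create --help`):

```shell
# Create a transfer job that syncs the GCS bucket to the S3 bucket once a day
gcloud transfer jobs create gs://my-gs-bucket s3://my-s3-bucket \
    --schedule-repeats-every=1d
```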

Skyplane is a much faster alternative for transferring data between clouds (up to 110x for large files). You can transfer data with the command:

skyplane cp -r s3://aws-bucket-name/ gcs://google-bucket-name/

