What is the best way to transfer BigQuery data to SFTP?

The data ranges from a few hundred MB up to a few GB. The job may run some BQ procedures and end with a SELECT; the result needs to be transferred as a valid CSV to an SFTP server.

Cloud Functions could be problematic because of the 9-minute timeout and the 2 GB RAM limit.

Is there a serverless solution, or do I have to run instances manually?

There are two scenarios I would consider:

  1. Export the table with the standard BQ export options (here) into a Google Cloud Storage bucket. Then you can pick the file up and upload it to SFTP with Cloud Run. There are containers built for this, e.g. this one; a sketch of this flow follows the list.
  2. Run a pipelining project. Since you want a simple export, I would suggest Dataflow. You can write a small Python or Java program to pick up the data and upload it to SFTP; a Dataflow sketch also appears below. If you need more complex processing logic, have a look at Dataproc.
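
For scenario 1, here is a minimal sketch of the export-then-upload flow. It assumes the `google-cloud-bigquery`, `google-cloud-storage` and `paramiko` packages (the linked container is not named here, so paramiko stands in as a generic SFTP client), and every project, bucket, table and SFTP detail is a placeholder:

```python
# Minimal sketch: export a BQ table to GCS, then stream it to SFTP.
# All names (project, dataset, bucket, host, credentials) are placeholders.
from google.cloud import bigquery, storage
import paramiko

TABLE_ID = "my-project.my_dataset.my_table"      # hypothetical table
BUCKET = "my-export-bucket"                      # hypothetical bucket
DEST_URI = f"gs://{BUCKET}/exports/result-*.csv" # wildcard: BQ may shard large exports

# 1. Export the (possibly procedure-produced) table to GCS as CSV.
bq = bigquery.Client()
job = bq.extract_table(
    TABLE_ID,
    DEST_URI,
    job_config=bigquery.ExtractJobConfig(destination_format="CSV"),
)
job.result()  # block until the export job finishes

# 2. Stream each exported shard from GCS to the SFTP server without
#    buffering whole files in memory, keeping Cloud Run RAM usage flat.
gcs = storage.Client()
transport = paramiko.Transport(("sftp.example.com", 22))  # placeholder host
transport.connect(username="user", password="secret")     # placeholder creds
sftp = paramiko.SFTPClient.from_transport(transport)
for blob in gcs.list_blobs(BUCKET, prefix="exports/"):
    with blob.open("rb") as fh:  # streaming read from GCS
        sftp.putfo(fh, f"/upload/{blob.name.split('/')[-1]}")
sftp.close()
transport.close()
```

Because Cloud Run allows longer request timeouts than the 9-minute Cloud Functions limit mentioned above, the streaming upload can run within a single request.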
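For scenario 2, a minimal Apache Beam sketch that could run on Dataflow, assuming `apache-beam[gcp]`. The query, column names, project and bucket are placeholders, and the final SFTP hop would still be handled separately (e.g. by the scenario 1 uploader):

```python
# Minimal Dataflow sketch: run the final SELECT and write the result
# to GCS as CSV. Project, query, columns and bucket are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

COLUMNS = ("id", "name", "value")  # hypothetical schema

def to_csv_line(row):
    # Naive CSV rendering; use the csv module for proper quoting of
    # values containing commas or newlines.
    return ",".join(str(row[col]) for col in COLUMNS)

options = PipelineOptions(
    runner="DataflowRunner",   # or "DirectRunner" for local testing
    project="my-project",      # placeholder project
    region="us-central1",
    temp_location="gs://my-export-bucket/tmp",
)
with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromBigQuery(
            query="SELECT id, name, value FROM `my_dataset.my_table`",
            use_standard_sql=True)
        | "ToCsv" >> beam.Map(to_csv_line)
        | "Write" >> beam.io.WriteToText(
            "gs://my-export-bucket/exports/result",
            file_name_suffix=".csv",
            header=",".join(COLUMNS))
    )
```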
