Transferring data from VM Instances to BigQuery in GCP
I am trying to transfer some files stored on my VM Instances to BigQuery. Normally we follow a two-step process: first copy the files from the VM to a Cloud Storage bucket, then load them from Cloud Storage into BigQuery.
Now, I want to move files directly from my VM Instances to BigQuery. Is there any way to do it?
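For context, the two-step process described above can be sketched like this (the bucket, dataset, table, and file names are placeholders):

```shell
# Step 1: copy the file from the VM to a Cloud Storage bucket.
gsutil cp abc.csv gs://my-bucket/abc.csv

# Step 2: load it from Cloud Storage into a BigQuery table.
bq load --autodetect --source_format=CSV mydataset.mytable gs://my-bucket/abc.csv
```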
You can load data directly from a readable data source (such as your local machine) by using: the bq command-line tool's bq load command, the API, or the client libraries. Please follow the official documentation to see examples of each method.
Moreover, if you want to stay with the idea of sending your files to a Cloud Storage bucket first, you can think about using a Dataflow template, which allows you to read text files stored in Cloud Storage, transform them using a JavaScript user-defined function (UDF) that you provide, and output the result to BigQuery. It is an automated solution.
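As a rough sketch, launching the Cloud Storage Text to BigQuery template could look like the following (every bucket path, the UDF file, the function name, and the table name are placeholder assumptions; verify the exact parameter names against the template's documentation):

```shell
# Run the batch "GCS Text to BigQuery" Dataflow template.
# All gs:// paths and the output table below are placeholders.
gcloud dataflow jobs run csv-to-bq \
  --region=us-central1 \
  --gcs-location=gs://dataflow-templates/latest/GCS_Text_to_BigQuery \
  --parameters=\
inputFilePattern=gs://my-bucket/input/*.csv,\
JSONPath=gs://my-bucket/schema.json,\
javascriptTextTransformGcsPath=gs://my-bucket/udf.js,\
javascriptTextTransformFunctionName=transform,\
outputTable=my-project:mydataset.mytable,\
bigQueryLoadingTemporaryDirectory=gs://my-bucket/tmp
```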
I hope you find the above information useful.
The solution would be to use the bq command for this. The command would look like this: bq load --autodetect --source_format=CSV xy abc.csv
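For readability, the same command can be written out with explicit placeholder names (mydataset.mytable and abc.csv are assumptions; --skip_leading_rows=1 assumes the CSV starts with a header row):

```shell
# Load a local CSV from the VM straight into BigQuery,
# letting BigQuery infer the schema from the data.
bq load \
  --autodetect \
  --source_format=CSV \
  --skip_leading_rows=1 \
  mydataset.mytable \
  ./abc.csv
```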