
From BigQuery to Google Cloud Storage

I would like to export data from BigQuery to Google Cloud Storage using a script. I also need to loop over multiple tables, save each one in CSV format, and overwrite any existing files.

Also, how can I schedule this script?

If anybody has an answer, that would be a great help.

Thanks in advance.

A common way to approach this problem is to use Airflow and write a DAG that meets your requirements.

But if you want to iterate over tables and dump them to GCS on a regular basis using only BigQuery, the following could be another option.

1. Export Data

You can export data to GCS with the EXPORT DATA statement in a BigQuery script.

EXPORT DATA OPTIONS(
  uri='gs://bucket/folder/*.csv',
  format='CSV',
  overwrite=true,
  header=true,
  field_delimiter=';') AS
SELECT field1, field2 FROM mydataset.table1 ORDER BY field1 LIMIT 10

2. Loops and Dynamic SQL

If you have a list of tables you want to dump, you can iterate over them with a BigQuery FOR...IN loop.

You then need to generate the EXPORT DATA statement dynamically for each table. To do so, you can use EXECUTE IMMEDIATE dynamic SQL.
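Putting the two together, a minimal sketch of the loop might look like the following. The dataset name, bucket path, and table list are placeholders; adjust them to your project.

```sql
-- Hypothetical dataset, bucket, and table names; replace with your own.
FOR record IN (
  SELECT table_name
  FROM mydataset.INFORMATION_SCHEMA.TABLES
  WHERE table_name IN ('table1', 'table2')
)
DO
  -- Build and run an EXPORT DATA statement for each table.
  -- overwrite=true replaces any existing files at the destination URI.
  EXECUTE IMMEDIATE FORMAT("""
    EXPORT DATA OPTIONS(
      uri='gs://bucket/folder/%s/*.csv',
      format='CSV',
      overwrite=true,
      header=true,
      field_delimiter=';') AS
    SELECT * FROM mydataset.%s
  """, record.table_name, record.table_name);
END FOR;
```

Note that each table is exported to its own subfolder so the wildcard URIs do not collide between tables.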

3. Scheduling

BigQuery provides a scheduled queries feature that runs a saved query on a recurring schedule, and you can use it to run the script above on a regular basis.
