Exporting BigQuery Table Data to Google Cloud Storage having where clause using python
I want to export table data from BigQuery to Google Cloud Storage. The problem is that I need the data from date1 to date2, not the whole table.
extract_job = client.extract_table(
    table_ref,
    destination_uri,
    # Location must match that of the source table.
    location='US')  # API request
extract_job.result()
This is what I found in the Google Cloud documentation. There is no way to add a query with a where clause or otherwise restrict the data.
Unfortunately, this will be a two-step process. First you need to build the result table, and then export the result. The cost impact should be minimal: you pay for the storage used by the temporary table, but at $0.02 per GB per month, if you manage to finish the task within 1 hour the cost works out to about $0.000027 per GB.
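The per-GB figure above is just the monthly storage rate pro-rated to one hour; a quick check of the arithmetic (assuming a ~730-hour month):

```python
# Sanity check of the figure above: $0.02 per GB-month,
# pro-rated to one hour (~730 hours in a month).
GB_MONTH_PRICE = 0.02
HOURS_PER_MONTH = 730
cost_per_gb_hour = GB_MONTH_PRICE / HOURS_PER_MONTH
print(f"${cost_per_gb_hour:.6f} per GB for a 1-hour staging table")
# prints "$0.000027 per GB for a 1-hour staging table"
```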
import time
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig()
gcs_filename = 'file_*.gzip'
table_ref = client.dataset(dataset_id).table('my_temp_table')
job_config.destination = table_ref
job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE

# Start the query, passing in the extra configuration.
query_job = client.query(
    """#standardSQL
    select * from `project.dataset.table` where <your_condition>;""",
    location='US',
    job_config=job_config)
while not query_job.done():
    time.sleep(1)
# check that the table was written successfully
print("query completed")
job_config = bigquery.ExtractJobConfig()
job_config.compression = bigquery.Compression.GZIP
job_config.destination_format = bigquery.DestinationFormat.CSV
job_config.print_header = False
destination_uri = 'gs://{}/{}'.format(bucket_name, gcs_filename)
extract_job = client.extract_table(
    table_ref,
    destination_uri,
    job_config=job_config,
    location='US')  # API request
extract_job.result()
print("extract completed")
Solution: export BigQuery data to Google Cloud Storage with a where clause using python
from google.cloud import bigquery
from google.cloud import storage

def export_to_gcs():
    QUERY = "SELECT * FROM TABLE where CONDITION"  # change the table and where condition
    bq_client = bigquery.Client()
    query_job = bq_client.query(QUERY)  # BigQuery API request
    rows_df = query_job.result().to_dataframe()
    storage_client = storage.Client()  # Storage API request
    bucket = storage_client.get_bucket(BUCKETNAME)  # change the bucket name
    blob = bucket.blob('temp/Add_to_Cart.csv')
    blob.upload_from_string(
        rows_df.to_csv(sep=';', index=False, encoding='utf-8'),
        content_type='application/octet-stream')
    return "success"
Use the "EXPORT DATA OPTIONS" statement in native BigQuery SQL to export the data from a SQL query.
Submit the SQL with the python client, and BigQuery takes care of the rest.
from google.cloud import bigquery

BQ = bigquery.Client()

def gcp_export_http(request):
    sql = """
    EXPORT DATA OPTIONS(uri="gs://gcs-bucket/*", format='PARQUET',
      compression='SNAPPY') AS
    SELECT * FROM table_name WHERE column_name > column_value
    """
    query_job = BQ.query(sql)
    res = query_job.result()
    return res
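To tie this back to the question's date1-to-date2 requirement, the EXPORT DATA statement can carry the date filter directly. A sketch that only builds the SQL string; the table, column, and bucket names are placeholders, and the uri must contain a `*` wildcard:

```python
def build_export_sql(gcs_uri, table, start_date, end_date):
    # Placeholder table/column names; dates are expected as 'YYYY-MM-DD'.
    return f"""
    EXPORT DATA OPTIONS(uri="{gcs_uri}", format='PARQUET',
      compression='SNAPPY') AS
    SELECT * FROM `{table}`
    WHERE event_date BETWEEN '{start_date}' AND '{end_date}'
    """

sql = build_export_sql("gs://gcs-bucket/part-*", "project.dataset.table",
                       "2021-01-01", "2021-01-31")
print(sql)
```

The returned string can then be passed to BQ.query(...) exactly as in gcp_export_http above.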
Disclaimer: the technical posts on this site follow the CC BY-SA 4.0 license. If you repost, please credit this site or the original source. For any questions contact: yoyou2525@163.com.