
Python - BigQuery Temporary Table

Is it possible to use Python to import data that is already in Cloud Storage into a temporary table in BigQuery? Can I create a BigQuery temporary table in Python and insert data into it?

You can only create temporary tables as part of a BigQuery script or stored procedure.
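For completeness, here is a minimal sketch of that approach: a multi-statement script submitted through the Python client, where the temp table exists only for the duration of the script (the project id and sample values are hypothetical):

from google.cloud import bigquery

client = bigquery.Client(project="myproject")  # hypothetical project id

# Multi-statement script: the temp table lives only while the script runs.
script = """
CREATE TEMP TABLE new_users (id STRING, full_name STRING, age INT64);
INSERT INTO new_users VALUES ('c-1234', 'John Smith', 39);
SELECT * FROM new_users;
"""
for row in client.query(script).result():
    print(dict(row))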

What you can do instead is create a table with a random name suffix and a short expiration (one hour in my case). The example function below creates such a temporary table and only requires a dataset as an argument.

from google.cloud import bigquery
import datetime, pytz, random

PROJECT = "myproject"


def get_temp_table(dataset: str, table_name: str = None, project=None) -> bigquery.Table:
    """Build a table definition with a random name suffix and a one-hour expiry."""
    prefix = "temp"
    suffix = random.randint(10000, 99999)  # random suffix to avoid name collisions
    if not table_name:
        table_name = "noname"

    temp_table_name = f"{dataset}.{prefix}_{table_name}_{suffix}"
    if project:
        temp_table_name = f"{project}.{temp_table_name}"
    tmp_table_def = bigquery.Table(temp_table_name)
    # BigQuery deletes the table automatically once it expires.
    tmp_table_def.expires = datetime.datetime.now(pytz.utc) + datetime.timedelta(
        hours=1
    )

    return tmp_table_def


client = bigquery.Client(project=PROJECT)

tmp_table_def = get_temp_table("mydataset", "new_users", project=PROJECT)
tmp_table_def.schema = [
    bigquery.SchemaField("id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("full_name", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("age", "INTEGER", mode="REQUIRED"),
]
tmp_table = client.create_table(tmp_table_def)  # type: bigquery.Table

data = [
    {"id": "c-1234", "full_name": "John Smith", "age": 39},
    {"id": "c-1234", "full_name": "Patricia Smith", "age": 41},
]

errors = client.insert_rows(tmp_table, data)

print(f"Loaded {len(data)} rows into {tmp_table.dataset_id}:{tmp_table.table_id} with {len(errors)} errors")

(This draft does not deal with temporary tables, but I think it can still help.) I use it with Google Cloud Functions and Python 3.7 and it works fine.

from google.cloud import storage, bigquery
import io
import os
import pandas as pd

def upload_dataframe_gbq(df, table_name):
    bq_client = bigquery.Client()
    dataset_id = 'your_dataset_id'
    dataset_ref = bq_client.dataset(dataset_id)
    table_ref = dataset_ref.table(table_name)
    job = bq_client.load_table_from_dataframe(df, table_ref)
    job.result()  # Waits for table load to complete.
    assert job.state == "DONE"
    table = bq_client.get_table(table_ref)
    print(table.num_rows)


os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "your_credentials.json"
client = storage.Client()
bucket = client.get_bucket('your_bucket_name')
blob = bucket.blob('sample.csv')
content = blob.download_as_string()
csv_content = io.BytesIO(content)  # wrap the bytes so pandas can read them
df = pd.read_csv(csv_content, sep=",", header=0)
table_name = "your_big_query_table_name"
upload_dataframe_gbq(df, table_name)
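To tie this back to the temporary-table question, the same DataFrame load can target an auto-expiring table like the one built by get_temp_table() in the first answer. A sketch, assuming that function and the df from above are in scope:

# Sketch: reuse get_temp_table() from the first answer (assumed in scope)
# so the loaded data disappears automatically after one hour.
bq_client = bigquery.Client()
tmp_table_def = get_temp_table('your_dataset_id', 'sample_csv')
tmp_table = bq_client.create_table(tmp_table_def)
job = bq_client.load_table_from_dataframe(df, tmp_table)
job.result()
print(f"Loaded {tmp_table.table_id}, expires at {tmp_table.expires}")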
