
Dynamic handling of BigQuery table schema while inserting data into a BQ table from a variable

I am trying to append data to BQ table using python code which requires dynamic schema handling. 我正在尝试使用需要动态架构处理的python代码将数据追加到BQ表中。 Can anyone provide me the link to handle above scenario. 谁能为我提供处理上述情况的链接。

Here is example code for loading a .csv file into BigQuery using the Python client library:

from google.cloud import bigquery

client = bigquery.Client()

# TODO: replace these placeholders with your own values
filename = '/path/to/file.csv'
dataset_id = 'my_dataset'
table_id = 'my_table'

dataset_ref = client.dataset(dataset_id)
table_ref = dataset_ref.table(table_id)

job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.CSV
job_config.skip_leading_rows = 1  # skip the CSV header row
job_config.autodetect = True      # let BigQuery infer the schema from the file

with open(filename, "rb") as source_file:
    job = client.load_table_from_file(source_file, table_ref, job_config=job_config)

job.result()  # Waits for the table load to complete.

print("Loaded {} rows into {}:{}.".format(job.output_rows, dataset_id, table_id))
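If you want to see what schema autodetect actually inferred, you can fetch the table metadata afterwards. A small sketch, reusing the same client and table_ref from the example above:

# Fetch the table and print the schema BigQuery inferred during the load.
table = client.get_table(table_ref)
for field in table.schema:
    print(field.name, field.field_type, field.mode)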

Also check this part of the documentation to learn more about appending data into tables from a source file using the same or a different schema.
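For the dynamic-schema append case specifically, the load job can be configured both to append rows and to allow the incoming file to change the table's schema. A minimal sketch, assuming the same filename, dataset_id and table_id placeholders as above:

from google.cloud import bigquery

client = bigquery.Client()
table_ref = client.dataset(dataset_id).table(table_id)

job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.CSV
job_config.skip_leading_rows = 1
job_config.autodetect = True
# Append to the existing table instead of overwriting it.
job_config.write_disposition = bigquery.WriteDisposition.WRITE_APPEND
# Allow the load to add new columns, or relax REQUIRED columns to NULLABLE,
# when the incoming data does not match the existing table schema.
job_config.schema_update_options = [
    bigquery.SchemaUpdateOption.ALLOW_FIELD_ADDITION,
    bigquery.SchemaUpdateOption.ALLOW_FIELD_RELAXATION,
]

with open(filename, "rb") as source_file:
    job = client.load_table_from_file(source_file, table_ref, job_config=job_config)
job.result()

Without the schema_update_options setting, an append whose data introduces new columns will fail with a schema mismatch error, so these two options are what make the append "dynamic".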
