
How to create BigQuery Data Transfer Service using Python

I tried creating a Data Transfer Service using bigquery_datatransfer. I used the following Python library:

pip install --upgrade google-cloud-bigquery-datatransfer

I used the method:

create_transfer_config(parent, transfer_config)

I have defined the transfer_config values for the data_source_id amazon_s3:

transfer_config = {
    "destination_dataset_id": "My Dataset",
    "display_name": "test_bqdts",
    "data_source_id": "amazon_s3",
    "params": {
        "destination_table_name_template": "destination_table_name",
        "data_path": <data_path>,
        "access_key_id": args.access_key_id,
        "secret_access_key": args.secret_access_key,
        "file_format": <>
    },
    "schedule": "every 10 minutes"
}

But while running the script I got the following error:

ValueError: Protocol message Struct has no "destination_table_name_template" field.

The fields given inside params are not recognized. Also, I couldn't find which fields should be defined inside the "params" struct.

What fields should be defined inside the "params" of transfer_config to create the Data Transfer job successfully?

As you can see in the documentation, you should try putting your dictionary inside the google.protobuf.json_format.ParseDict() function. The error appears because "params" is a protobuf Struct field, so the client cannot build it from a plain Python dict directly; ParseDict converts the whole dictionary into a valid TransferConfig message. For example, for a scheduled query:

from google.cloud import bigquery_datatransfer_v1
from google.protobuf import json_format

# ParseDict converts the plain dict (including the nested "params" dict,
# which maps to a protobuf Struct) into a proper TransferConfig message.
transfer_config = json_format.ParseDict(
    {
        "destination_dataset_id": dataset_id,
        "display_name": "Your Scheduled Query Name",
        "data_source_id": "scheduled_query",
        "params": {
            "query": query_string,
            "destination_table_name_template": "your_table_{run_date}",
            "write_disposition": "WRITE_TRUNCATE",
            "partitioning_field": "",
        },
        "schedule": "every 24 hours",
    },
    bigquery_datatransfer_v1.types.TransferConfig(),
)
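
Applying the same ParseDict pattern to the amazon_s3 source from your question should resolve the Struct error. Below is a minimal sketch, not a definitive implementation: project_id, dataset_id, data_path, access_key_id and secret_access_key are placeholders you must supply, and "CSV" for file_format is an assumption, so use whatever format your files actually have.

from google.cloud import bigquery_datatransfer_v1
from google.protobuf import json_format

client = bigquery_datatransfer_v1.DataTransferServiceClient()
# Placeholder: your GCP project ID.
parent = client.project_path(project_id)

transfer_config = json_format.ParseDict(
    {
        "destination_dataset_id": dataset_id,
        "display_name": "test_bqdts",
        "data_source_id": "amazon_s3",
        "params": {
            "destination_table_name_template": "destination_table_name",
            "data_path": data_path,  # e.g. an s3://bucket/folder/* pattern
            "access_key_id": access_key_id,
            "secret_access_key": secret_access_key,
            "file_format": "CSV",  # assumption; match your source files
        },
        "schedule": "every 10 minutes",
    },
    bigquery_datatransfer_v1.types.TransferConfig(),
)

# Create the transfer config; the response contains the resource name.
response = client.create_transfer_config(parent, transfer_config)
print("Created transfer config: {}".format(response.name))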

Please let me know if this helps you.
