I tried creating a Data Transfer Service using bigquery_datatransfer. I installed the following Python library:
pip install --upgrade google-cloud-bigquery-datatransfer
and used the method
create_transfer_config(parent, transfer_config)
I defined the transfer_config values for the data_source_id amazon_s3:
transfer_config = {
    "destination_dataset_id": "My Dataset",
    "display_name": "test_bqdts",
    "data_source_id": "amazon_s3",
    "params": {
        "destination_table_name_template": "destination_table_name",
        "data_path": <data_path>,
        "access_key_id": args.access_key_id,
        "secret_access_key": args.secret_access_key,
        "file_format": <>
    },
    "schedule": "every 10 minutes"
}
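For context, this config is passed to the client roughly as follows (a sketch, not the exact script; the client setup, project ID, and parent path are my assumptions):

from google.cloud import bigquery_datatransfer_v1

client = bigquery_datatransfer_v1.DataTransferServiceClient()

# The parent resource is the project that owns the transfer config.
# "your-project-id" is a placeholder.
parent = client.project_path("your-project-id")

# Passing the plain dict is where the conversion to the protobuf
# message happens, and where the ValueError below is raised.
response = client.create_transfer_config(parent, transfer_config)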
But while running the script, I get the following error:
ValueError: Protocol message Struct has no "destination_table_name_template" field.
The fields given inside params are not recognized, and I couldn't find which fields should be defined inside the "params" struct.
What fields need to be defined inside the "params" of transfer_config to create the Data Transfer job successfully?
As you can see in the documentation, you should try passing your config dict through the google.protobuf.json_format.ParseDict() function:
import google.protobuf.json_format
from google.cloud import bigquery_datatransfer_v1

# ParseDict builds the TransferConfig message from a plain dict,
# correctly converting the nested "params" dict into its Struct field.
transfer_config = google.protobuf.json_format.ParseDict(
    {
        "destination_dataset_id": dataset_id,
        "display_name": "Your Scheduled Query Name",
        "data_source_id": "scheduled_query",
        "params": {
            "query": query_string,
            "destination_table_name_template": "your_table_{run_date}",
            "write_disposition": "WRITE_TRUNCATE",
            "partitioning_field": "",
        },
        "schedule": "every 24 hours",
    },
    bigquery_datatransfer_v1.types.TransferConfig(),
)
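The same approach adapts to the amazon_s3 config from your question. Below is a minimal sketch under some assumptions: the param names come from your snippet and match the documented Amazon S3 transfer parameters, while the project, dataset, data path, credentials, and file format values are placeholders you need to fill in:

import google.protobuf.json_format
from google.cloud import bigquery_datatransfer_v1

client = bigquery_datatransfer_v1.DataTransferServiceClient()
parent = client.project_path("your-project-id")  # placeholder project ID

transfer_config = google.protobuf.json_format.ParseDict(
    {
        "destination_dataset_id": "your_dataset",  # placeholder dataset ID
        "display_name": "test_bqdts",
        "data_source_id": "amazon_s3",
        "params": {
            "destination_table_name_template": "destination_table_name",
            "data_path": "s3://your-bucket/path/*",  # placeholder S3 URI
            "access_key_id": "YOUR_ACCESS_KEY_ID",  # placeholder credential
            "secret_access_key": "YOUR_SECRET_ACCESS_KEY",  # placeholder credential
            "file_format": "CSV",  # e.g. CSV, JSON, AVRO, PARQUET
        },
        # The documented minimum interval for Amazon S3 transfers is
        # 24 hours, so "every 10 minutes" will not be accepted.
        "schedule": "every 24 hours",
    },
    bigquery_datatransfer_v1.types.TransferConfig(),
)

response = client.create_transfer_config(parent, transfer_config)
print("Created transfer config: {}".format(response.name))

ParseDict converts the nested "params" dict into the protobuf Struct field that TransferConfig expects, which is exactly the conversion the direct dict assignment fails to perform.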
Please let me know if this helps.