Airflow: Invalid arguments and keyword argument
Using GCSToBigQueryOperator, this error occurs:
Broken DAG: [/opt/airflow/dags/injest_data.py] Traceback (most recent call last):
File "/opt/airflow/dags/injest_data.py", line 79, in <module>
> "sourceUris": [f"gs://{BUCKET_NAME}/*"],
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 397, in apply_defaults
raise AirflowException(f"missing keyword arguments {display}")
airflow.exceptions.AirflowException: missing keyword arguments 'bucket', 'destination_project_dataset_table', 'source_objects'
And when I tried to change to BigQueryCreateExternalTableOperator, this other error occurred:
Broken DAG: [/opt/airflow/dags/injest_data.py] Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 411, in apply_defaults
result = func(self, **kwargs, default_args=default_args)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 760, in __init__
f"Invalid arguments were passed to {self.__class__.__name__} (task_id: {task_id}). "
airflow.exceptions.AirflowException: Invalid arguments were passed to BigQueryCreateExternalTableOperator (task_id: bq_external_table_task). Invalid arguments were:
**kwargs: {'tables_resouces': {'tableReferences': {'projectId': 'de-projects-373304', 'datasetId': 'stockmarket_dataset', 'tableId': 'stockmarket_ex
I have tried changing the Google query operators and have even tried a different method to upload the data to BigQuery, but it says the schema doesn't exist. Please help me understand what I am doing wrong. Thanks in advance for your help; below is the code causing the error:
bq_external_table_task = BigQueryCreateExternalTableOperator(
    task_id="bq_external_table_task",
    tables_resouces={
        "tableReferences": {
            "projectId": PROJECT_ID,
            "datasetId": BIGQUERY_DATASET,
            "tableId": f"{DATASET}_external_table",
        },
        "externalDataConfiguration": {
            "autodetect": True,
            "sourceFormat": f"{INPUT_FILETYPE.upper()}",
            "sourceUris": [f"gs://{BUCKET_NAME}/*"],
        },
    },
)
There is no sourceUris named parameter in GCSToBigQueryOperator. It should be source_objects. Kindly check the operator's parameters in the official document: GCSToBigQueryOperator
Your BigQueryCreateExternalTableOperator also has wrong parameter names. tables_resouces should be table_resource. You can also check this operator's parameters in the official document: BigQueryCreateExternalTableOperator
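A corrected version of the question's table resource could look like the sketch below. Note that in the BigQuery API's Table resource the key is tableReference (singular), not tableReferences as in the question; the variable values here are placeholders:

```python
# Placeholder values standing in for the question's variables.
PROJECT_ID = "de-projects-373304"
BIGQUERY_DATASET = "stockmarket_dataset"
DATASET = "stockmarket"
BUCKET_NAME = "my-bucket"
INPUT_FILETYPE = "parquet"

# `table_resource` (singular) replaces the misspelled `tables_resouces`;
# inside it, the BigQuery API expects `tableReference`, not `tableReferences`.
table_resource = {
    "tableReference": {
        "projectId": PROJECT_ID,
        "datasetId": BIGQUERY_DATASET,
        "tableId": f"{DATASET}_external_table",
    },
    "externalDataConfiguration": {
        "autodetect": True,
        "sourceFormat": INPUT_FILETYPE.upper(),
        "sourceUris": [f"gs://{BUCKET_NAME}/*"],
    },
}
# Inside the DAG:
# bq_external_table_task = BigQueryCreateExternalTableOperator(
#     task_id="bq_external_table_task",
#     table_resource=table_resource,
# )
```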