
Airflow Invalid arguments and keyword argument

Using GCSToBigQueryOperator, this error occurs:

    Broken DAG: [/opt/airflow/dags/injest_data.py] Traceback (most recent call last):
      File "/opt/airflow/dags/injest_data.py", line 79, in <module>
        "sourceUris": [f"gs://{BUCKET_NAME}/*"],
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 397, in apply_defaults
        raise AirflowException(f"missing keyword arguments {display}")
    airflow.exceptions.AirflowException: missing keyword arguments 'bucket', 'destination_project_dataset_table', 'source_objects'

And when I tried to change to BigQueryCreateExternalTableOperator, this other error occurred:

    Broken DAG: [/opt/airflow/dags/injest_data.py] Traceback (most recent call last):
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 411, in apply_defaults
        result = func(self, **kwargs, default_args=default_args)
      File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 760, in __init__
        f"Invalid arguments were passed to {self.__class__.__name__} (task_id: {task_id}). "
    airflow.exceptions.AirflowException: Invalid arguments were passed to BigQueryCreateExternalTableOperator (task_id: bq_external_table_task). Invalid arguments were:
    **kwargs: {'tables_resouces': {'tableReferences': {'projectId': 'de-projects-373304', 'datasetId': 'stockmarket_dataset', 'tableId': 'stockmarket_ex

Thanks in advance for your help...

I have tried changing the Google query operators and even tried a different method to upload the data to BigQuery, but it says the schema doesn't exist. Please help me understand what I am doing wrong. Below is the code causing the error:

    bq_external_table_task = BigQueryCreateExternalTableOperator(
            task_id = "bq_external_table_task",
            tables_resouces = {
                "tableReferences": {
                    "projectId": PROJECT_ID,
                    "datasetId": BIGQUERY_DATASET,
                    "tableId":f"{DATASET}_external_table",
                },
                "externalDataConfiguration": {
                    "autodetect": True,
                    "sourceFormat": f"{INPUT_FILETYPE.upper()}",
                    "sourceUris": [f"gs://{BUCKET_NAME}/*"],
                },
        },
    )

There is no sourceUris named parameter in GCSToBigQueryOperator; it should be source_objects. Kindly check the operator's parameters in the official document: GCSToBigQueryOperator.
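For reference, here is a minimal sketch of a corrected call. It reuses the variables from the question's DAG (BUCKET_NAME, PROJECT_ID, BIGQUERY_DATASET, DATASET, INPUT_FILETYPE) and maps the intent of your sourceUris value onto the keyword arguments named in the error message:

    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    # Sketch only: variable names are taken from the question's DAG.
    gcs_to_bq_task = GCSToBigQueryOperator(
        task_id="gcs_to_bq_task",
        bucket=BUCKET_NAME,                # bucket name only, no "gs://" prefix
        source_objects=["*"],              # object paths/wildcards inside the bucket
        destination_project_dataset_table=f"{PROJECT_ID}.{BIGQUERY_DATASET}.{DATASET}_external_table",
        source_format=INPUT_FILETYPE.upper(),  # e.g. "CSV" or "PARQUET"
        autodetect=True,                   # let BigQuery infer the schema
    )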

Your BigQueryCreateExternalTableOperator also has wrong parameter names: tables_resouces should be table_resource. You can also check this operator's parameters in the official document: BigQueryCreateExternalTableOperator.
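As a minimal sketch of the corrected call, again using the question's variables (note that the BigQuery Table resource format uses the nested key tableReference, singular, rather than the tableReferences in your snippet):

    from airflow.providers.google.cloud.operators.bigquery import BigQueryCreateExternalTableOperator

    # Sketch only: the parameter is table_resource, and the nested key is
    # "tableReference" (singular), per the BigQuery Table resource format.
    bq_external_table_task = BigQueryCreateExternalTableOperator(
        task_id="bq_external_table_task",
        table_resource={
            "tableReference": {
                "projectId": PROJECT_ID,
                "datasetId": BIGQUERY_DATASET,
                "tableId": f"{DATASET}_external_table",
            },
            "externalDataConfiguration": {
                "autodetect": True,
                "sourceFormat": INPUT_FILETYPE.upper(),
                "sourceUris": [f"gs://{BUCKET_NAME}/*"],
            },
        },
    )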
