Exporting a BigQuery table with Airflow: "extract_table() missing 1 required positional argument: 'self'" error
I am trying to export a BigQuery table to Google Cloud Storage with an Airflow task, but I get the following error message:
{standard_task_runner.py:93} ERROR - Failed to execute job 1819 for task export_objs_table_to_bucket (extract_table() missing 1 required positional argument: 'self'; 1764166)
This is the Airflow task I created:
task2_objs = PythonOperator(
    task_id="export_objs_table_to_bucket",
    python_callable=callables.bq_export_to_gcs,
    op_kwargs={
        "bucket": BUCKET,
        "blob": BLOB_OBJ,
        "project": PROJECT,
        "dataset_id": DATASET_ID,
        "table_id": TABLE_ID_OBJS,
    },
)
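For context, PythonOperator calls python_callable with op_kwargs unpacked as keyword arguments, so the dictionary keys must match the callable's parameter names exactly. A minimal pure-Python illustration of that mechanism (the values below are placeholders, and this does not involve Airflow itself):

```python
# Stand-in for the callable; parameter names mirror the op_kwargs keys.
def bq_export_to_gcs(bucket, blob, project, dataset_id, table_id):
    return f"gs://{bucket}/{blob}"

op_kwargs = {
    "bucket": "my-bucket",
    "blob": "objs/export.csv",
    "project": "my-project",
    "dataset_id": "my_dataset",
    "table_id": "my_table",
}

# At run time, Airflow effectively does python_callable(**op_kwargs).
uri = bq_export_to_gcs(**op_kwargs)
print(uri)  # gs://my-bucket/objs/export.csv
```

If a key in op_kwargs does not match a parameter name, the task fails with a TypeError before any BigQuery work starts.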
And this is the callable Python function that I use in that task, which, by the way, I took from the GCP documentation: https://cloud.google.com/bigquery/docs/samples/bigquery-extract-table?hl=zh-419
def bq_export_to_gcs(bucket: str,
                     blob: str,
                     project: str,
                     dataset_id: str,
                     table_id: str) -> None:
    client = bigquery.Client
    destination_uri = f"gs://{bucket}/{blob}"
    dataset_ref = bigquery.DatasetReference(project, dataset_id)
    table_ref = dataset_ref.table(table_id)
    extract_job = client.extract_table(
        source=table_ref,
        destination_uris=destination_uri,
        location="europe-west1",
    )
    extract_job.result()
I honestly don't know whether this is an Airflow or a BigQuery issue, but as I mentioned, I am doing exactly what the BigQuery documentation shows, so I am completely lost.
I was able to reproduce your use case with the code you provided. The problem is that client is never created correctly, because the class is not instantiated. You should update the line client = bigquery.Client to client = bigquery.Client()
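To see why the error message mentions 'self': without the parentheses, client is bound to the class itself, so client.extract_table is a plain function whose first positional parameter, self, is never supplied when everything else is passed by keyword. A small self-contained sketch (using a dummy Client class, not the real BigQuery client) reproduces the same TypeError:

```python
class Client:
    """Dummy stand-in for google.cloud.bigquery.Client."""
    def extract_table(self, source):
        return f"extracting {source}"

client = Client                       # bug: binds the class, not an instance
try:
    client.extract_table(source="t")  # keyword-only call: 'self' is never supplied
except TypeError as exc:
    print(exc)                        # ... missing 1 required positional argument: 'self'

client = Client()                     # fix: instantiate the class
print(client.extract_table(source="t"))  # extracting t
```

With the parentheses, Python creates an instance and binds extract_table to it, so self is filled in automatically.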
That solves the problem, but I would like to show another solution anyway. You can also extract the table to GCS with the existing BigQueryToGCSOperator. An example that exports the table to a CSV file with a comma (,) as the field delimiter:
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator

BigQueryToGCSOperator(
    task_id='task_id',
    source_project_dataset_table='my_project.my_dataset.my_table',
    export_format='CSV',
    destination_cloud_storage_uris=[
        'my_bucket_path'
    ],
    field_delimiter=','
)