Apache Airflow Error Running DAG (ERROR - [Errno 2] No such file or directory)

I'm trying to figure out why we're getting this error. Is it a missing dependency? A version problem? And why does it happen only with this DAG and not the others?

The error is:

FileNotFoundError: [Errno 2] No such file or directory: /home/airflow/composer_kube_config

Here is our DAG:

import datetime

from airflow import DAG
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
from airflow.contrib.kubernetes.secret import Secret
from airflow.contrib.kubernetes.volume import Volume
from airflow.contrib.kubernetes.volume_mount import VolumeMount
# from airflow.contrib.kubernetes.pod import Port

from utils.constants import DEFAULT_ARGS, DUMB_BUCKET, SCHEMA_BUCKET, PROJECT, \
    CLOUD_COMPOSER_SERVICE_ACCOUNT_SECRET, STAGING_BUCKET

volume_mount = VolumeMount(
    'secret',
    mount_path='/etc/secret',
    sub_path=None,
    read_only=True
)
volume_config = {'persistentVolumeClaim': {'claimName': 'all-ftp'}}
volume = Volume(name='secret', configs=volume_config)

with DAG('ftp_file_poller',
         schedule_interval="55 6 * * *",
         start_date=datetime.datetime(2020, 7, 1)) as dag:
    poller = KubernetesPodOperator(
        secrets=[CLOUD_COMPOSER_SERVICE_ACCOUNT_SECRET],
        task_id='ftp-file-poller',
        name='ftp-polling',
        cmds=['ftp-poller'],
        namespace='default',
        image='us.gcr.io/<our gcp project>/ftp-poller:v7',
        is_delete_operator_pod=True,
        get_logs=True,
        volumes=[volume],
        volume_mounts=[volume_mount]
    )
    poller.doc = """
    about this DAG info
    """

Here is the reference to this file that I found in the documentation:

  # Only name, namespace, image, and task_id are required to create a
  # KubernetesPodOperator. In Cloud Composer, currently the operator defaults
  # to using the config file found at `/home/airflow/composer_kube_config` if
  # no `config_file` parameter is specified. By default it will contain the
  # credentials for Cloud Composer's Google Kubernetes Engine cluster that is
  # created upon environment creation.
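Per the quoted docs, the operator falls back to `/home/airflow/composer_kube_config` only when no `config_file` parameter is given. One way to make that dependency explicit is to pass `config_file` yourself. This is a sketch, not the asker's code: the helper function and its argument values are hypothetical, and the resulting dict would be splatted into `KubernetesPodOperator(**kwargs)`.

```python
# Sketch: build KubernetesPodOperator kwargs with config_file set explicitly,
# instead of relying on the Cloud Composer default lookup.
# The helper and the example values below are hypothetical.

COMPOSER_KUBE_CONFIG = "/home/airflow/composer_kube_config"

def pod_operator_kwargs(task_id, name, image, namespace="default"):
    """Return constructor kwargs for a KubernetesPodOperator with an
    explicit config_file, mirroring the required fields from the docs."""
    return {
        "task_id": task_id,
        "name": name,
        "image": image,
        "namespace": namespace,
        "config_file": COMPOSER_KUBE_CONFIG,  # explicit instead of implicit default
    }

kwargs = pod_operator_kwargs(
    "ftp-file-poller",
    "ftp-polling",
    "us.gcr.io/<our gcp project>/ftp-poller:v7",
)
```

Passing `config_file` explicitly also makes the failure mode clearer: if the file is genuinely absent on the worker, the error points at a value you set rather than a hidden default.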

This was solved by adding the DEFAULT_ARGS constant to the DAG definition, like so:

with DAG(dag_id='ftp_file_poller',
        schedule_interval="55 6 * * *",
        start_date=datetime.datetime(2020,7,1),
        default_args=DEFAULT_ARGS) as dag:
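The actual contents of `DEFAULT_ARGS` live in `utils.constants` and are not shown in the question. As context for why this fixes the error: Airflow merges `default_args` entries into every task's constructor kwargs, so a value set there reaches the `KubernetesPodOperator`. A purely hypothetical shape (every key and value here is an assumption, not the project's real constant) might be:

```python
# Hypothetical DEFAULT_ARGS (assumption -- the real constant is defined in
# utils.constants and is not shown in the question). Airflow copies
# default_args entries onto each task in the DAG, which is how a dict like
# this can change operator behavior without editing each task.
import datetime

DEFAULT_ARGS = {
    "owner": "airflow",                            # hypothetical value
    "retries": 1,                                  # hypothetical value
    "retry_delay": datetime.timedelta(minutes=5),  # hypothetical value
}
```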

