
Airflow: on failure, make all DAGs do something specific

We have a lot of DAGs running on Airflow. When something fails, we want to be notified or to perform a specific action. I have tried doing this with a decorator:

from functools import wraps

def on_failure_callback(f):
  @wraps(f)
  def wrap(*args, **kwargs):
    try:
      return f(*args, **kwargs)
    except Exception as e:
      return f"An exception {e} occurred in {f}"
  return wrap

This works, but every function we want this behavior on has to be decorated individually.
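For reference, this is how the decorator is applied to an individual callable (the task function here is hypothetical):

```python
from functools import wraps

def on_failure_callback(f):
    # Wrap a callable so any exception is caught and reported
    @wraps(f)
    def wrap(*args, **kwargs):
        try:
            return f(*args, **kwargs)
        except Exception as e:
            return f"An exception {e} occurred in {f.__name__}"
    return wrap

@on_failure_callback
def my_task():  # hypothetical task callable
    raise ValueError("boom")

result = my_task()  # returns the error string instead of raising
print(result)
```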

I saw this and tried to implement it like this:

def on_failure_callback(context):
    operator = PythonOperator(
        task_id='failure_task',
        python_callable=failure)

    return operator.execute(context=context)


def failure():
    return 'Failure in the failure func'


dag_args = {
    "retries": 2,
    "retry_delay": timedelta(minutes=2),
    'on_failure_callback': on_failure_callback
}

And then in the DAG definition I use [...] default_args=dag_args [...], but this does not work.

What is the best way to accomplish this?

Thanks

Simplest way: Airflow sends mail on retry and on failure if the email_on_retry and email_on_failure attributes of BaseOperator are true (the default) and Airflow's mail configuration is set.
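For that to work, Airflow's SMTP settings must be configured, typically in the [smtp] section of airflow.cfg; a sketch (all values here are placeholders):

```ini
[smtp]
smtp_host = smtp.example.com
smtp_starttls = True
smtp_ssl = False
smtp_user = airflow
smtp_password = airflow
smtp_port = 587
smtp_mail_from = airflow@example.com
```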

With custom operator:

from airflow.operators.email_operator import EmailOperator
from airflow.operators.slack_operator import SlackAPIPostOperator

def on_failure_callback(context):
    # with mail:
    error_mail = EmailOperator(
        task_id='error_mail',
        to='user@example.com',
        subject='Fail',
        html_content='a task failed',
        mime_charset='utf-8')
    error_mail.execute({})  # no need to return, just execute

    # with slack:
    error_message = SlackAPIPostOperator(
        task_id='error_message',
        token=getSlackToken(),
        text='a task failed',
        channel=SLACK_CHANNEL,
        username=SLACK_USER)
    error_message.execute({})  # no need to return, just execute

dag_args = {
    "retries": 2,
    "retry_delay": timedelta(minutes=2),
    'on_failure_callback': on_failure_callback
}
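Note that the callback receives Airflow's context dict, so if you only need to log or forward the failure, a plain function is enough and no operator is needed. A minimal sketch (the key names follow Airflow's callback context; the context dict is faked here for illustration):

```python
def on_failure_callback(context):
    # Airflow populates the context dict; 'task_instance' and
    # 'exception' are among the keys passed to failure callbacks
    ti = context.get("task_instance")
    exc = context.get("exception")
    message = f"Task {ti} failed: {exc}"
    print(message)
    return message

# Faked context for illustration; Airflow builds the real one at runtime
fake_context = {"task_instance": "my_dag.my_task", "exception": "boom"}
result = on_failure_callback(fake_context)
```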

The easiest way, IMO, is to define it in the DAG's default arguments:

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
    'email': ['airflow@example.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}

You can also use Airflow's SendGrid utility if you want to customize the email-sending behavior based on your task dependencies: https://github.com/apache/airflow/blob/master/airflow/contrib/utils/sendgrid.py
