
Kill process signal from Airflow to spark/yarn

Does anyone know how the following can be implemented? When you mark a task in Airflow as failed, the Airflow process stops, but the application running in YARN is not stopped.

Something similar to

except KeyboardInterrupt:

when the application is run in a terminal.
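
For illustration only (not part of the original question), a minimal sketch of the terminal-side pattern being referred to; the spark-submit arguments and the application id application_1234_0001 are placeholders:

import subprocess

try:
    # Run the job in the foreground, e.g. via spark-submit against YARN.
    subprocess.run(["spark-submit", "--master", "yarn", "job.py"], check=True)
except KeyboardInterrupt:
    # Ctrl-C stops the local process; also tell YARN to stop the application.
    subprocess.run(["yarn", "application", "-kill", "application_1234_0001"], check=False)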

You can call functions either on success or on failure by using Airflow's callback concepts: https://airflow.apache.org/docs/apache-airflow/2.2.2/logging-monitoring/callbacks.html
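
As a sketch under assumptions (not code from the answer), an on_failure_callback that kills the corresponding YARN application could look like the following. The XCom key "yarn_app_id" and the placeholder submit callable are hypothetical conventions; on_failure_callback and the yarn application -kill command are standard Airflow and YARN features:

import subprocess
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def kill_yarn_app(context):
    # Called by Airflow when the task fails (or is marked failed).
    # How the YARN application id is obtained is up to you; here we assume
    # the task pushed it to XCom under the key "yarn_app_id" (hypothetical).
    ti = context["task_instance"]
    app_id = ti.xcom_pull(task_ids=ti.task_id, key="yarn_app_id")
    if app_id:
        # Standard YARN CLI call to terminate the application.
        subprocess.run(["yarn", "application", "-kill", app_id], check=False)

with DAG(
    dag_id="spark_on_yarn_example",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
) as dag:
    submit = PythonOperator(
        task_id="submit_spark_job",
        python_callable=lambda **kwargs: None,  # placeholder for the real submit logic
        on_failure_callback=kill_yarn_app,
    )

Note that the callback runs on the worker, so the yarn CLI must be available there, and the task has to record its YARN application id (for example by pushing it to XCom) for the callback to find it.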
