

I use BashOperator to execute a python file in Airflow; how do I import other self-defined functions?

I use BashOperator to execute a python file called app.py in Airflow. I wrote another python script called to_es.py, which contains a function called df_to_es().

app.py should call df_to_es() via from utils.to_es import df_to_es, but Airflow throws an error in red: 'there is no module called "def_to_es"'.
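For context, the setup described in the question would look roughly like the sketch below. The DAG file name, dag_id, schedule and paths are assumptions for illustration only, and the BashOperator import path shown is the Airflow 2.x one (in 1.x it lives in airflow.operators.bash_operator).

# run_app_dag.py -- hypothetical DAG file placed in the Airflow dags/ folder.
# Assumed layout (not confirmed by the question):
#   dags/
#     run_app_dag.py
#     app.py              <- executed by the BashOperator
#     utils/
#       __init__.py
#       to_es.py          <- defines df_to_es()
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator  # Airflow 2.x import path

with DAG(
    dag_id="run_app",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # cd into the dags folder first so that `python app.py` runs next to the
    # utils/ package and `from utils.to_es import df_to_es` can resolve
    run_app = BashOperator(
        task_id="run_app",
        bash_command="cd /path/to/airflow/dags && python app.py",
    )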

Finally, I came back to answer my own question.
Even though Airflow may report a DAG import error, if you use BashOperator to execute your Python script, the self-defined functions, classes and modules you import inside that script will work fine, provided there are no other errors. Just double-check that you are using the correct Airflow DAG directory.
So just ignore that DAG import error if you are in my situation. This is something the Airflow development team needs to improve, perhaps with something like a unit test.
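One way to make the import robust no matter which working directory the BashOperator uses is to put the directory of app.py on sys.path before importing. The sketch below is my addition and not part of the original answer; the sys.path tweak and the __main__ guard are assumptions for illustration.

# app.py -- sketch only; the sys.path insertion is an assumption, added so
# that the utils/ package is importable regardless of where the BashOperator
# launches the script from
import os
import sys

# Make the folder containing app.py (and the utils/ package) importable
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

from utils.to_es import df_to_es

if __name__ == "__main__":
    df_to_es()  # call the self-defined function from utils/to_es.py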
