DataJobs Airflow DAGs for GCP and Azure
Implement an Operator to abstract cloud-provider-specific (GCP & Azure) calls (SQL/Python/Spark).
I have found this documentation: https://airflow.apache.org/docs/apache-airflow/1.10.14/_modules/airflow/contrib/example_dags/example_gcp_sql.html
But I am not clear how to implement this according to my task.
You might want to look through the available operators of the Google and Azure Airflow providers to see if an operator exists that fits your use case.
The Astronomer registry has some example DAGs with implementations of some of the operators in those packages (Google example DAGs, Azure example DAGs).
Disclaimer: I work at Astronomer :)
PS: The link you shared points to very outdated documentation of pre-2 Airflow. It is very possible for that DAG code to throw errors in modern Airflow.
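Since the question is about abstracting provider-specific calls behind one operator, here is a minimal plain-Python sketch of that dispatch pattern. All names (`CloudSqlJob`, `run_gcp_sql`, `run_azure_sql`) are hypothetical; in a real DAG the `execute` method would live in a custom `BaseOperator` subclass and the placeholder functions would call the actual Google/Azure provider hooks or operators:

```python
# Sketch of a provider-agnostic dispatch pattern (assumptions: names are
# invented; in Airflow this logic would sit inside a BaseOperator subclass).
from typing import Callable, Dict


def run_gcp_sql(sql: str) -> str:
    # Placeholder: a real implementation would call a Google provider
    # hook/operator from apache-airflow-providers-google here.
    return f"GCP executed: {sql}"


def run_azure_sql(sql: str) -> str:
    # Placeholder: a real implementation would call an Azure provider
    # hook/operator from apache-airflow-providers-microsoft-azure here.
    return f"Azure executed: {sql}"


class CloudSqlJob:
    """Dispatches a SQL job to the configured cloud provider."""

    _BACKENDS: Dict[str, Callable[[str], str]] = {
        "gcp": run_gcp_sql,
        "azure": run_azure_sql,
    }

    def __init__(self, provider: str) -> None:
        if provider not in self._BACKENDS:
            raise ValueError(f"Unsupported provider: {provider}")
        self._run = self._BACKENDS[provider]

    def execute(self, sql: str) -> str:
        # In an Airflow operator this would be execute(self, context).
        return self._run(sql)


print(CloudSqlJob("gcp").execute("SELECT 1"))    # GCP executed: SELECT 1
print(CloudSqlJob("azure").execute("SELECT 1"))  # Azure executed: SELECT 1
```

The same table-of-callables idea extends to the Python and Spark job types from the question: one mapping per job type, keyed by provider.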