Does Airflow clear Tasks remove data from the database also?

We have a use case where we have to populate fresh data in our DB. There is already old data present in our DB from a successful DAG run. Now we need to delete the old data and re-run the task. Airflow already provides a command to clear a selection.

airflow clear -dx occupancy_reports.* -t building -s 2022-04-01 -e 2022-04-30

Will running this also delete the data from the database and then populate fresh data?

I guess you meant: airflow **tasks** clear ...
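
For reference, a possible Airflow 2 spelling of the original command is sketched below. It assumes the `airflow tasks clear` subcommand, where the DAG id is a positional argument and the `-R`/`--dag-regex` flag requests regex matching in place of the old Airflow 1.10 `-dx` flag; verify the exact flags with `airflow tasks clear --help` on your version before running it.

airflow tasks clear -R -t building -s 2022-04-01 -e 2022-04-30 'occupancy_reports.*'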

It only clears the set of task instances, as if they never ran (it is not a rollback).
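
Clearing therefore only marks the task instances for re-execution in Airflow's metadata database; any rows the task wrote earlier stay in the application database. A common way to get the "delete old data, then repopulate" behaviour is to make the task itself idempotent: have it delete its own interval's rows before inserting. Below is a minimal sketch assuming Airflow 2's TaskFlow API, a Postgres connection id reports_db, and hypothetical tables building_occupancy and staging_occupancy; none of these names come from the original question.

# Minimal sketch of an idempotent load task. Assumptions (not from the
# original question): a Postgres connection "reports_db" and the tables
# building_occupancy / staging_occupancy. Because `airflow tasks clear`
# does not roll back side effects, the task deletes the old rows for its
# data interval before inserting, so clearing + re-running yields fresh data.
import pendulum
from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook


@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2022, 4, 1, tz="UTC"),
    catchup=False,
)
def occupancy_reports():
    @task
    def building(data_interval_start=None, data_interval_end=None):
        # Airflow injects these context variables into declared kwargs.
        hook = PostgresHook(postgres_conn_id="reports_db")  # assumed conn id
        # Delete any rows from a previous run of this interval, then insert
        # the fresh rows; both statements run in the same transaction.
        hook.run(
            """
            DELETE FROM building_occupancy
            WHERE report_date >= %(start)s AND report_date < %(end)s;
            INSERT INTO building_occupancy (report_date, occupancy)
            SELECT report_date, occupancy FROM staging_occupancy
            WHERE report_date >= %(start)s AND report_date < %(end)s;
            """,
            parameters={"start": data_interval_start, "end": data_interval_end},
        )

    building()


occupancy_reports()

With a task written this way, airflow tasks clear simply re-runs it, and the DELETE at the start of the run replaces the stale rows with fresh ones.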
