
Azure Databricks: terminate the cluster from Data Factory

I have a use case of running a Spark job every day. I am using Databricks to run the job. Since it is a daily job, I would like to create a cluster, run the notebook, and destroy the cluster afterwards. I am using Data Factory to do this, but I don't see any option to customize the "Inactivity period" (auto-termination) when creating the Databricks linked service in Data Factory.

How can I destroy the cluster once my job is completed?

Just choose "New job cluster" in the Databricks linked service. Job clusters are only active during the job's lifetime: Data Factory creates the cluster when the activity starts, and Databricks terminates it automatically when the run finishes, so there is no "Inactivity period" to configure.
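To see what a job cluster means outside the Data Factory UI, here is a minimal sketch that does the same thing directly against the Databricks Jobs API (`POST /api/2.1/jobs/runs/submit`) from Python. The environment variable names, notebook path, and cluster sizing below are placeholder assumptions, not anything from the question; the point is that the `new_cluster` spec exists only for this one run and is torn down automatically, which is the behaviour the "New job cluster" option gives you in the linked service.

```python
# Minimal sketch: submit a one-time notebook run on a NEW job cluster.
# Host/token env var names and the notebook path are hypothetical placeholders.
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-1234.5.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # personal access token

payload = {
    "run_name": "daily-spark-job",
    "tasks": [
        {
            "task_key": "run_notebook",
            "notebook_task": {
                "notebook_path": "/Jobs/daily_job",  # replace with your notebook
            },
            # This cluster is created for the run and terminated when it ends;
            # no inactivity timeout is needed.
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 2,
            },
        }
    ],
}

resp = requests.post(
    f"{host}/api/2.1/jobs/runs/submit",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print("Submitted run:", resp.json()["run_id"])
```

When you pick "New job cluster" in the Data Factory linked service, ADF assembles an equivalent cluster spec from the linked service properties on every Notebook activity run, so each daily run gets a fresh cluster that disappears on completion.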
