I have a use case that requires running a Spark job every day. I am using Databricks to run the job. Since it is a daily job, I would like to create a cluster, run the notebook, and then destroy the cluster. I am using Azure Data Factory for orchestration, but I don't see any option to customise the "Inactivity period" in the Databricks linked service when configuring the cluster.

How can I destroy the cluster once my job has completed?
Just choose "New job cluster" instead of an existing interactive cluster when creating the Databricks linked service. A job cluster is created when the job starts and terminated automatically when the job finishes, so there is no inactivity period to configure and you are only billed for the job's runtime.
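For reference, a Databricks linked service configured with a new job cluster looks roughly like the sketch below. The `domain`, `accessToken`, node type, and runtime version values are placeholders you would replace with your own; the exact property names follow the Azure Data Factory linked service schema for Azure Databricks.

```json
{
    "name": "AzureDatabricksLinkedService",
    "properties": {
        "type": "AzureDatabricks",
        "typeProperties": {
            "domain": "https://adb-1234567890123456.7.azuredatabricks.net",
            "accessToken": {
                "type": "SecureString",
                "value": "<databricks-personal-access-token>"
            },
            "newClusterNodeType": "Standard_DS3_v2",
            "newClusterNumOfWorker": "2",
            "newClusterVersion": "13.3.x-scala2.12"
        }
    }
}
```

Because `newCluster*` properties are specified instead of `existingClusterId`, Data Factory asks Databricks to spin up a fresh job cluster for each notebook activity run and Databricks tears it down when the run ends.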