I want to run an Azure Databricks notebook from a Python file. I have a client ID, secret, and token ID. I tried creating a Databricks client, but there does not seem to be a package that can run a Databricks notebook. TIA for any suggestions.
The answer should run a Databricks notebook the same way this code runs a Data Factory pipeline:
adf_client = DataFactoryManagementClient(credentials, subscription_id)
run_response = adf_client.pipelines.create_run(rg_name, df_name, df_pipeline_name, parameters=...)
pipeline_run = adf_client.pipeline_runs.get(rg_name, df_name, run_response.run_id)
status = pipeline_run.status
while status == 'Queued' or status == 'InProgress':
    time.sleep(5)
    status = adf_client.pipeline_runs.get(rg_name, df_name, run_response.run_id).status
You can use the Databricks REST API to trigger Databricks jobs. You first have to configure a job with a cluster and a notebook.
You can check this blog, which demonstrates the process. The blog calls the APIs via Postman; you just have to replace that with Python code.
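The Postman steps from the blog can be sketched in Python using only the standard library. This is a minimal sketch, not a definitive implementation: the workspace URL, `job_id`, and token are placeholders you must supply, and it mirrors the polling loop from the ADF code in the question using the Jobs API 2.1 `run-now` and `runs/get` endpoints.

```python
import json
import time
import urllib.request


def build_headers(token):
    # Works with either a Databricks personal access token or an AAD bearer token.
    return {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}


def build_run_payload(job_id, notebook_params=None):
    # Request body for POST /api/2.1/jobs/run-now; notebook_params are
    # passed to the notebook as widget values.
    payload = {"job_id": job_id}
    if notebook_params:
        payload["notebook_params"] = notebook_params
    return payload


def run_job_and_wait(host, token, job_id, notebook_params=None, poll_seconds=5):
    """Trigger the job, then poll runs/get until it leaves PENDING/RUNNING."""
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/run-now",
        data=json.dumps(build_run_payload(job_id, notebook_params)).encode(),
        headers=build_headers(token),
    )
    with urllib.request.urlopen(req) as resp:
        run_id = json.load(resp)["run_id"]

    while True:
        status_req = urllib.request.Request(
            f"{host}/api/2.1/jobs/runs/get?run_id={run_id}",
            headers=build_headers(token),
        )
        with urllib.request.urlopen(status_req) as resp:
            state = json.load(resp)["state"]
        if state["life_cycle_state"] in ("PENDING", "RUNNING"):
            time.sleep(poll_seconds)
        else:
            # Terminal state; inspect state["result_state"] for SUCCESS/FAILED.
            return state
```

You would call it as `run_job_and_wait("https://adb-<workspace>.azuredatabricks.net", token, job_id)`; the `requests` library works the same way if you prefer it over `urllib`.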
The official Databricks REST API documentation for triggering a job can be found here. The Databricks documentation also shows how to call the APIs from Python code.
You can use a Databricks token or an AAD bearer token for authorization.
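Since you already have a client ID and secret, you can obtain an AAD bearer token with the client-credentials grant. This is a hedged sketch: the tenant ID, client ID, and secret are placeholders, and the scope uses what is documented as the Azure Databricks first-party application ID (verify it against the Azure Databricks AAD authentication docs).

```python
import json
import urllib.parse
import urllib.request

# Assumed: the documented Azure Databricks resource (application) ID,
# used as the OAuth scope for AAD tokens. Verify against the Azure docs.
DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"


def build_token_request(tenant_id, client_id, client_secret):
    """Token endpoint URL and form body for the client-credentials grant."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{DATABRICKS_RESOURCE_ID}/.default",
    })
    return url, body


def get_aad_token(tenant_id, client_id, client_secret):
    url, body = build_token_request(tenant_id, client_id, client_secret)
    req = urllib.request.Request(
        url,
        data=body.encode(),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

The returned token goes into the `Authorization: Bearer <token>` header of the Jobs API calls; alternatively, the `azure-identity` package's `ClientSecretCredential` can fetch the same token.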