
Create a notebook inside another notebook in Databricks dynamically using Python

I am trying to create a notebook inside another notebook. The created notebook should contain both Python code and SQL code (using the %sql and %python magic commands). I need to run the created notebook from the parent notebook once it is created. Can anyone suggest a good way to do this?

I found dbutils.notebook.run(), which lets me run an already existing notebook, but I am looking for a way to create a notebook first and run it later. Any suggestion is appreciated!

You can use the import command of the Databricks Workspace REST API.

Something like this (put the notebook source into the content value; note that the import API expects it base64-encoded):

import requests
import os
import json
import base64

# Read the API host, token, and current notebook path from the notebook context
ctx = json.loads(dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson())
host_name = ctx['extraContext']['api_url']
host_token = ctx['extraContext']['api_token']
notebook_path = ctx['extraContext']['notebook_path']
new_path = os.path.join(os.path.dirname(notebook_path), 'New name')

data = {
    # the import API requires the source to be base64-encoded
    "content": base64.b64encode("some code".encode("utf-8")).decode("utf-8"),
    "path": new_path,
    "language": "PYTHON",
    "overwrite": True,  # Python boolean, not the JSON literal `true`
    "format": "SOURCE"
}

response = requests.post(
    f'{host_name}/api/2.0/workspace/import',
    headers={'Authorization': f'Bearer {host_token}'},
    json=data  # send as a JSON body
).json()
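Since the question asks for a notebook that mixes Python and SQL, here is a minimal sketch of building such a SOURCE-format notebook body and encoding it for the content field. In SOURCE format, Databricks separates cells with a # COMMAND ---------- marker and prefixes cells in a non-default language (e.g. SQL inside a Python notebook) with # MAGIC comments; the cell contents below are illustrative assumptions, not code from the original answer:

```python
import base64

# Hypothetical cells: one plain Python cell and one SQL cell.
# SQL inside a PYTHON-language notebook must be written as "# MAGIC %sql" lines.
cells = [
    "print('hello from python')",
    "# MAGIC %sql\n# MAGIC SELECT 1 AS example_col",
]

# Cells in SOURCE format are joined by the COMMAND separator line.
source = "\n\n# COMMAND ----------\n\n".join(cells)

# The import API requires the content base64-encoded.
encoded = base64.b64encode(source.encode("utf-8")).decode("utf-8")

# Round-trip check: decoding recovers the original notebook source.
assert base64.b64decode(encoded).decode("utf-8") == source
```

Once the import request succeeds, the new notebook can be executed from the parent with dbutils.notebook.run(new_path, timeout_seconds), which is only available inside a Databricks notebook and therefore not part of the locally runnable sketch above.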
