
Editing the Azure Storage Account in Azure Databricks

I have Databricks pointing to a storage account in Azure, but the region was incorrect. Now I want to change it to point to a different storage account. I mounted the storage with the code below:

dbutils.fs.mount(
    source = "wasbs://" + mountname + "@" + storageAccount + ".blob.core.windows.net",
    mount_point = root + mountname ,
    extra_configs = {"fs.azure.account.key." + storageAccount + ".blob.core.windows.net":dbutils.secrets.get(scope = "", key = "")})

This executes properly, but once I use %fs ls dbfs:/mnt/ to list the directories, it shows the directories of the old storage account.
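One way to see which storage account a mount point actually resolves to is dbutils.fs.mounts(), which lists every mount with its source URL. A minimal sketch (the helper name and example values are mine, not from the original post):

```python
# Hedged sketch: build the wasbs:// source URL for a container so it can be
# compared against what dbutils.fs.mounts() reports. The helper name and the
# example values are illustrative, not from the original post.

def wasbs_source(container: str, storage_account: str) -> str:
    """Return the wasbs:// URL used as the mount source for a Blob container."""
    return f"wasbs://{container}@{storage_account}.blob.core.windows.net"

# In a Databricks notebook you could then inspect the active mounts:
# for m in dbutils.fs.mounts():
#     print(m.mountPoint, "->", m.source)  # shows which account each mount uses
```

If the source printed for /mnt/&lt;mountname&gt; still names the old storage account, the old mount is still in place, which explains the stale listing.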

How can I achieve this, if it is possible at all?

All you need to do is unmount the existing mount point and mount it again against the correct storage account.

OR

Create a new mount point with reference to the new storage account.

Unmount a mount point:

dbutils.fs.unmount("/mnt/<mountname>")
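The two options above (unmount-then-remount, or create a fresh mount point) can be combined into one idempotent helper. This is a sketch under my own naming; dbutils is passed in explicitly to keep the logic plain, and the real object only exists inside a Databricks notebook:

```python
# Sketch of an idempotent remount: unmount the mount point if it already
# exists, then mount the new source. `remount` and its parameter names are
# my own; `dbutils` is the Databricks utility object available in notebooks.

def remount(dbutils, mount_point: str, source: str, extra_configs: dict) -> None:
    existing = {m.mountPoint for m in dbutils.fs.mounts()}
    if mount_point in existing:
        dbutils.fs.unmount(mount_point)   # drop the stale mount first
    dbutils.fs.mount(
        source=source,
        mount_point=mount_point,
        extra_configs=extra_configs)      # mount the correct storage account
```

Called in a notebook cell as remount(dbutils, "/mnt/&lt;mountname&gt;", source, configs), this replaces a stale mount in one step and simply mounts if the mount point does not exist yet.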


To mount a Blob Storage container or a folder inside a container, use the following command (note this snippet is Scala, hence the camelCase parameter names):

dbutils.fs.mount(
  source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net/<directory-name>",
  mountPoint = "/mnt/<mount-name>",
  extraConfigs = Map("<conf-key>" -> dbutils.secrets.get(scope = "<scope-name>", key = "<key-name>")))
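Since the question's code is Python, here is a hedged Python counterpart of the Scala snippet above. The blob_key_conf helper is my own naming, and the angle-bracket placeholders are left exactly as in the original:

```python
# Python counterpart of the Scala mount snippet. `blob_key_conf` is my own
# helper name; the <placeholders> are unchanged from the original snippet.

def blob_key_conf(storage_account: str) -> str:
    """Spark config key that carries the access key for a storage account."""
    return f"fs.azure.account.key.{storage_account}.blob.core.windows.net"

# In a Databricks notebook (Python):
# dbutils.fs.mount(
#     source="wasbs://<container-name>@<storage-account-name>.blob.core.windows.net/<directory-name>",
#     mount_point="/mnt/<mount-name>",
#     extra_configs={
#         blob_key_conf("<storage-account-name>"):
#             dbutils.secrets.get(scope="<scope-name>", key="<key-name>")})
```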

