I want to import existing Databricks infrastructure into Terraform, but I can't import existing mounts. I have a mount to an S3 bucket on AWS: dbfs:/mnt/copyprod. According to the official documentation of the Databricks provider, this command should work:
$ terraform import databricks_mount.this <mount_name>
I have created an appropriate resource block:
resource "databricks_mount" "copyprod" {
...
}
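For reference, a fully specified block might look like the sketch below. This is only a hypothetical example: the cluster ID, bucket name, and instance-profile ARN are placeholders, not values from my setup.

```hcl
resource "databricks_mount" "copyprod" {
  name       = "copyprod"
  # Assumed: ID of an existing cluster the provider can use to run the mount command
  cluster_id = "0101-000000-abcd123"

  s3 {
    # Placeholder bucket name and instance profile ARN
    bucket_name      = "my-copyprod-bucket"
    instance_profile = "arn:aws:iam::123456789012:instance-profile/databricks-s3-access"
  }
}
```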
but when I try to run the command terraform import databricks_mount.copyprod copyprod, which worked for other resources, I always get the same error:
databricks_mount.copyprod: Importing from ID "copyprod"...
databricks_mount.copyprod: Import prepared!
Prepared databricks_mount for import
databricks_mount.copyprod: Refreshing state... [id=copyprod]
╷
│ Error: value of name is not specified or empty
What "value of name" is this? Where should I specify it? As I understand it, there is no need to define arguments in the resource block for the import to work, since it only updates the .tfstate file, but even if I do define them (name, bucket_name, etc.), the error is always the same. I also tried passing <mount_name> in different styles, but with no luck. How can I make this work?
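By "different styles" I mean variants of the mount ID such as the following (a hypothetical list of the obvious candidates, all of which fail with the same error):

```shell
terraform import databricks_mount.copyprod copyprod
terraform import databricks_mount.copyprod /mnt/copyprod
terraform import databricks_mount.copyprod dbfs:/mnt/copyprod
```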
The connection to Databricks is definitely working, as I can import other resources such as clusters or notebooks. I am using Terraform v1.0.9 and Databricks provider v0.4.2.
This is a bug in the provider, and it looks like it hasn't worked correctly for some time (although with a different error) - I've tested the old implementations (databricks_aws_s3_mount and databricks_azure_adls_gen2_mount) on 0.3.5 & 0.3.10 (versions before the databricks_mount resource was added).
Please report this issue on GitHub (mention me in the text); I'll look into it early next week, or maybe over the weekend if I have time.