
Parameterizing the Container name in the Script of Azure Data Flow ARM template

I'm trying to deploy an ADF data flow to multiple environments using ARM templates. I was able to deploy a hardcoded script for the data flow, but I need to parameterize the storage account container name in the script.

This is the relevant part of the script:

source(allowSchemaDrift: true,
	validateSchema: false,
	ignoreNoFilesFound: false,
	format: 'delimited',
	container: ' containerName ',
	columnDelimiter: ',',
	escapeChar: '\\',
	quoteChar: '"',
	columnNamesAsHeader: false,
	wildcardPaths:['FolderName/FileName*']) ~> source1

I tried the options below. concat also doesn't work, because the script contains many single quotes:

  • @{variables('containerName')}
  • @{[variables('containerName')]}

Is there a way to parameterize the script part of the data flow in the ARM template?

I struggled a lot with ARM templates on a similar issue, then I tried something slightly different that worked for me: a pipeline global parameter. A global parameter can easily be overridden when deploying the ARM templates from Azure DevOps releases (i.e., when moving the templates from the DEV to the PROD data factory).
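If the factory is configured to include global parameters in the ARM template export, each global parameter surfaces as a template parameter that the release can override. A minimal sketch of such an override parameters file, assuming the export generated a key like default_properties_containerName_value (the exact name depends on your factory — check the generated ARMTemplateParametersForFactory.json):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "default_properties_containerName_value": {
      "value": "prod-container"
    }
  }
}
```

The same override can be supplied inline in the ARM deployment task's "Override template parameters" field instead of a separate file.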

Since data flows don't have access to global parameters, I did it this way: my data flow has a parameter containerName, and I always override it from the pipeline that calls the data flow with the value of the global parameter of the same name, using this pipeline expression: @pipeline().globalParameters.containerName
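A minimal sketch of what the parameterized data flow script can look like, assuming a data flow parameter named containerName as described above (expression-valued properties in the script are wrapped in parentheses):

```
parameters{
	containerName as string ('dev-container')
}
source(allowSchemaDrift: true,
	validateSchema: false,
	ignoreNoFilesFound: false,
	format: 'delimited',
	container: ($containerName),
	columnDelimiter: ',',
	escapeChar: '\\',
	quoteChar: '"',
	columnNamesAsHeader: false,
	wildcardPaths:['FolderName/FileName*']) ~> source1
```

In the calling pipeline's Execute Data Flow activity, set containerName to the expression @pipeline().globalParameters.containerName, so each environment picks up its own global parameter value after deployment.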
