
Creating a container in azure storage before uploading the data using ADF v2

Thanks in advance. I'm new to ADF and have created a pipeline from the ADF portal. The source is an on-premises server folder and the destination dataset is Azure Blob storage. I'm using a tumbling window trigger that passes the window start time and end time, and only uploads the latest data by filtering on lastModified datetime.

Question: I want to create subfolders on the fly in Azure storage. If I use /container/$monthvariable, will it automatically create a subfolder based on the month variable?

For example, here my source is:

dfac/
$monthvariable = 5

If I put

dfac/$monthvariable

then all the files will be uploaded under dfac/5/ and will look like this:

dfac/5/file1
dfac/5/file2
dfac/5/file3

Here in ADF I want to get the month of the pipeline run and use it in the destination path. Is that something I can do, and where can I define the variable?

  {
            "name": "Destination",
            "value": "dfac/$monthvariable"
        }

Does this work, and is it the right way to do it?
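For reference, ADF v2 does not expand `$monthvariable`-style placeholders inside a string literal. The usual approach is to add a parameter to the sink dataset and build the folder path with a dynamic-content expression. The sketch below assumes a String parameter named `month` (a name chosen here for illustration) on the Blob sink dataset; Blob storage then creates the `dfac/5/`-style virtual folder automatically on the first write:

```json
{
    "name": "DestinationDataset_ayy",
    "properties": {
        "type": "AzureBlob",
        "parameters": {
            "month": { "type": "String" }
        },
        "typeProperties": {
            "folderPath": {
                "value": "@concat('dfac/', dataset().month)",
                "type": "Expression"
            },
            "format": { "type": "TextFormat" }
        }
    }
}
```

`@concat()` and `dataset().<parameterName>` are standard ADF expression constructs; the rest of the dataset body is a minimal sketch, not your exact definition.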

My actual code looks like this:

{
    "name": "Copy_ayy",
    "type": "Copy",
    "policy": {
        "timeout": "7.00:00:00",
        "retry": 2,
        "retryIntervalInSeconds": 30,
        "secureOutput": false,
        "secureInput": false
    },
    "userProperties": [
        {
            "name": "Source",
            "value": "/*"
        },
        {
            "name": "Destination",
            "value": "dfac/"
        }
    ],
    "typeProperties": {
        "source": {
            "type": "FileSystemSource",
            "recursive": true
        },
        "sink": {
            "type": "BlobSink",
            "copyBehavior": "PreserveHierarchy"
        },
        "enableStaging": false
    },
    "inputs": [
        {
            "referenceName": "SourceDataset_ayy",
            "type": "DatasetReference",
            "parameters": {
                "cw_modifiedDatetimeStart": "@pipeline().parameters.windowStart",
                "cw_modifiedDatetimeEnd": "@pipeline().parameters.windowEnd"
            }
        }
    ],
    "outputs": [
        {
            "referenceName": "DestinationDataset_ayy",
            "type": "DatasetReference"
        }
    ]
}
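If the sink dataset is parameterized (say, with an assumed String parameter named `month`), the `outputs` block of the copy activity above can pass a month value derived from the tumbling-window start time, for example:

```json
"outputs": [
    {
        "referenceName": "DestinationDataset_ayy",
        "type": "DatasetReference",
        "parameters": {
            "month": "@{formatDateTime(pipeline().parameters.windowStart, 'MM')}"
        }
    }
]
```

`formatDateTime()` uses .NET format strings, so `'MM'` yields a zero-padded month such as `05`; adjust the format string if you need `5` instead.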

I believe you are using the Copy Data tool. It can also help you with the destination path part: it will create the parameters for you.
