
Accessing metadata results in a nested pipeline for Azure Data Factory

I have a pipeline built that reads metadata from a blob container subfolder, raw/subfolder. I then execute a ForEach loop with another Get Metadata activity to get data for each subfolder; it returns paths such as /raw/subfolder1/folder1, /raw/subfolder2/folder1, and so on. I need another ForEach loop to access the files inside each folder. The problem is that you cannot run a ForEach loop inside another ForEach loop, so I cannot iterate further down to the files.
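For context, the first stage described above is typically defined like this in the pipeline JSON. This is only a minimal sketch, not the asker's actual definition; the activity and dataset names (GetRawFolders, RawContainerDataset, ForEachSubfolder) are placeholders, and the inner Get Metadata activity is abbreviated:

    {
        "name": "GetRawFolders",
        "type": "GetMetadata",
        "typeProperties": {
            "dataset": {
                "referenceName": "RawContainerDataset",
                "type": "DatasetReference"
            },
            "fieldList": [ "childItems" ]
        }
    },
    {
        "name": "ForEachSubfolder",
        "type": "ForEach",
        "dependsOn": [
            { "activity": "GetRawFolders", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
            "items": {
                "value": "@activity('GetRawFolders').output.childItems",
                "type": "Expression"
            },
            "activities": [
                { "name": "GetSubfolderMetadata", "type": "GetMetadata" }
            ]
        }
    }

The nesting restriction means the inner activities list cannot itself contain another ForEach, which is what forces the Execute Pipeline workaround discussed below.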

I have an Execute Pipeline activity that calls the above pipeline and then uses a ForEach. My issue is that I haven't found a way to pass item().name from the above iteration into the new pipeline. It doesn't appear you can pass objects from the previous pipeline. How can I accomplish this nested-ForEach metadata gathering so I can iterate further on my files?

Have you tried using parameters? Here is how it would look:

  1. In your parent pipeline, click on the "Execute Pipeline" activity that triggers the inner pipeline (your new one), go to Settings, and pass the item name as a parameter called "name".
  2. In your inner pipeline, click anywhere on the empty canvas and add a new parameter "name".
  3. Now you can refer to that parameter as pipeline().parameters.name (see the JSON sketch after this list).
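A minimal JSON sketch of those three steps, assuming the outer ForEach iterates over the childItems output of a Get Metadata activity called GetFolderMetadata and the inner pipeline is named InnerPipeline (all names are placeholders, not from the original post):

    {
        "name": "ForEachFolder",
        "type": "ForEach",
        "typeProperties": {
            "items": {
                "value": "@activity('GetFolderMetadata').output.childItems",
                "type": "Expression"
            },
            "activities": [
                {
                    "name": "CallInnerPipeline",
                    "type": "ExecutePipeline",
                    "typeProperties": {
                        "pipeline": {
                            "referenceName": "InnerPipeline",
                            "type": "PipelineReference"
                        },
                        "waitOnCompletion": true,
                        "parameters": {
                            "name": "@item().name"
                        }
                    }
                }
            ]
        }
    }

And the inner pipeline declares the matching parameter:

    {
        "name": "InnerPipeline",
        "properties": {
            "parameters": {
                "name": { "type": "string" }
            },
            "activities": []
        }
    }

Inside InnerPipeline, expressions can then read the folder name as @pipeline().parameters.name, for example to build the folder path for its own Get Metadata or Copy activity.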

Using parameters works in this scenario, as @Andrii mentioned. For more on passing parameters between activities, refer to this link: https://azure.microsoft.com/en-in/resources/azure-data-factory-passing-parameters/
