
Azure Data Factory: multiple tables to multiple CSVs

I have a business scenario: pull every table from one database, say AdventureWorks, and write each table's data to a separate CSV file in the data lake. For example, if AdventureWorks has 20 tables, I need to copy all of them in parallel, so that 20 tables produce 20 CSV files in Azure Data Lake. How can I do this with Azure Data Factory? Please don't suggest the ForEach activity; it processes the files sequentially and is time-consuming.

In Data Factory, two features can help you create 20 CSV files from 20 tables in one pipeline: the ForEach activity and Data Flow.
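On the ForEach route, note that the activity does not have to run sequentially: setting `isSequential` to `false` and a `batchCount` lets the inner Copy run for several tables at once. A minimal pipeline fragment as a sketch (the activity names and the Lookup feeding the table list are hypothetical; only the skeleton of the inner Copy is shown):

```json
{
  "name": "ForEachTable",
  "type": "ForEach",
  "typeProperties": {
    "isSequential": false,
    "batchCount": 20,
    "items": {
      "value": "@activity('LookupTableList').output.value",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CopyTableToCsv",
        "type": "Copy",
        "typeProperties": {
          "source": { "type": "AzureSqlSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```

With `isSequential: false` and `batchCount: 20`, up to 20 copies run concurrently, which addresses the sequential-ForEach concern raised in the question.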

In a Data Flow, add 20 Sources and 20 Sinks, for example: [screenshot: Data Flow canvas with multiple source-to-sink streams]

Whichever approach you choose, the copy operations still take some time to run.

What you should do is think about how to improve the copy performance, as Thiago Gustodio said in a comment; that can save you time. [screenshot: performance tuning suggestions]

For example, provision more DTUs for your database, and use more DIUs (Data Integration Units) for your Copy activity.
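DIUs, and the degree of parallelism within a single copy, can be set directly on the Copy activity. A hedged fragment, assuming the same hypothetical activity name as above (both properties exist in the Copy activity JSON schema):

```json
{
  "name": "CopyTableToCsv",
  "type": "Copy",
  "typeProperties": {
    "source": { "type": "AzureSqlSource" },
    "sink": { "type": "DelimitedTextSink" },
    "dataIntegrationUnits": 32,
    "parallelCopies": 8
  }
}
```

Raising `dataIntegrationUnits` buys more compute per copy run, while `parallelCopies` controls how many parallel reads a single Copy activity uses; both are tuning knobs described in the performance guides linked below.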

Please reference these Data Factory documents:

  1. Mapping data flows performance and tuning guide
  2. Copy activity performance and scalability guide

Both provide performance guidance.

Hope this helps.
