
Problems with initial refresh of Power BI Data Flow - Too demanding on data source

I am faced with a challenge I would love to get some pointers for.

I have a couple of very large tables housed in a SQL Server instance that is also the primary production database for the application it supports.

I want to load 2 years' worth of historical data, after which I will implement incremental refresh. The problem is that when I attempted the initial load, the end users of the application experienced timeouts and all the other things we don't want to see in production.

I am looking for a way to feed the data flow either in small steps of a month or less, or to provide the historical data separately via a .csv file or some other means.
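For reference, that month-by-month chunking amounts to a date-windowed extraction query. Power BI's incremental refresh produces essentially this shape by folding its RangeStart/RangeEnd datetime parameters into the source query; the table and column names below are hypothetical:

```sql
-- Sketch: pull one month of history per pass instead of the full two years.
-- @RangeStart / @RangeEnd stand in for the RangeStart/RangeEnd parameters
-- that Power BI incremental refresh folds into the source query.
DECLARE @RangeStart datetime2 = '2022-01-01';
DECLARE @RangeEnd   datetime2 = DATEADD(MONTH, 1, @RangeStart);

SELECT OrderID, CustomerID, OrderDate, Amount   -- hypothetical columns
FROM dbo.FactSales                              -- hypothetical table
WHERE OrderDate >= @RangeStart
  AND OrderDate <  @RangeEnd;                   -- half-open window avoids overlap
```

Each window touches only a month of rows, so no single pass scans the whole table against production.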

Can anybody share some insight on how to go about this? I've tried researching this issue, but I've not found a way so far.

Thank you in advance!

If you don't have the option to mirror your data, I would suggest a dirty read:

-- Read without taking shared locks, so production writers are not blocked.
-- Note: rows returned may be uncommitted ("dirty").
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

BEGIN TRANSACTION;

-- your sql script

COMMIT TRANSACTION;
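The dirty read can be combined with the same month-sized windows, so each pass both avoids shared locks and touches only a slice of the table. Names here are illustrative, and READ UNCOMMITTED can return uncommitted or double-counted rows if writes are happening concurrently:

```sql
-- Sketch: dirty read of a single month window. No shared locks are taken,
-- so application users are not blocked while the history is extracted.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT OrderID, OrderDate, Amount      -- hypothetical columns
FROM dbo.FactSales                     -- hypothetical table
WHERE OrderDate >= '2022-01-01'
  AND OrderDate <  '2022-02-01';       -- half-open month window
```

Run one window per refresh (or per exported .csv) until the two years of history are loaded, then switch to incremental refresh for ongoing loads.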
