I'm calling ExecuteMultipleRequest to insert 25 records of a custom entity at a time. Each batch is taking roughly 20 seconds. Some info about the custom entity: on each CreateRequest, the entity has 6 attribute values filled: 2 Lookup and 4 Money.

ExecuteMultipleRequest is being called from a middleware component in a corporate network, which connects to the CRM in the cloud. The CRM instance used is a sandbox, so it may have some restrictions (CPU/bandwidth/IO/etc.) that I'm not aware of.
I can issue concurrent requests, but considering I can only have 2 concurrent requests per organization ( https://msdn.microsoft.com/en-au/library/jj863631.aspx#limitations ), that would only cut the time in half, which is still not viable. For each new custom CRM process created, I need to load at most 5000 entity records in less than 10 minutes.
What can I do to improve the performance of this load? Where should I be looking? Would a DataImport ( https://msdn.microsoft.com/en-us/library/hh547396.aspx ) be faster than ExecuteMultipleRequest?
I only really have suggestions for this; you would probably have to experiment and investigate to see what works for you.
Can you run your middleware application in a physical location closer to your CRM Online site?
ExecuteMultipleRequest supports much larger batch sizes, up to 1000.
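Larger batches amortize the per-round-trip latency that dominates when the middleware is far from the cloud instance. A minimal sketch of a 1000-record batch, assuming a connected `IOrganizationService` (`Microsoft.Xrm.Sdk`) and a pre-built list of `Entity` objects (`recordsBatch` is a placeholder name):

```csharp
// Sketch only: requires a live CRM connection; not runnable standalone.
var executeMultiple = new ExecuteMultipleRequest
{
    Settings = new ExecuteMultipleSettings
    {
        ContinueOnError = true,
        ReturnResponses = false  // skipping responses trims the return payload
    },
    Requests = new OrganizationRequestCollection()
};

// recordsBatch: up to 1000 Entity instances per ExecuteMultipleRequest
foreach (Entity record in recordsBatch)
{
    executeMultiple.Requests.Add(new CreateRequest { Target = record });
}

var response = (ExecuteMultipleResponse)service.Execute(executeMultiple);
```

Setting `ReturnResponses = false` is worth trying when you only need to know about failures (`ContinueOnError = true` still surfaces faulted items in the response collection).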
Have you compared this with just using single Execute requests?
Do you have lots of processes (workflows, plugins) that run in CRM while the data import is in progress? These can have a big performance impact. Perhaps they can be disabled during the import, e.g. you could pre-process the data before import so a plugin wouldn't need to be executed.
The concurrent requests limitation only applies to ExecuteMultipleRequest; have you tried running lots of parallel single Execute requests?
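One way to sketch that, assuming each worker thread gets its own service channel (`OrganizationServiceProxy` instances are not thread-safe, so `CreateOrganizationService` below is a hypothetical per-thread connection factory you would supply):

```csharp
// Sketch only: requires a live CRM connection; not runnable standalone.
Parallel.ForEach(
    recordsToCreate,                                     // IEnumerable<Entity>
    new ParallelOptions { MaxDegreeOfParallelism = 8 },  // tune experimentally
    () => CreateOrganizationService(),                   // per-thread connection (hypothetical factory)
    (record, loopState, service) =>
    {
        service.Create(record);                          // one plain Create per call
        return service;
    },
    service => (service as IDisposable)?.Dispose());     // tear down each thread's channel
```

The right degree of parallelism depends on what the server-side throttling actually allows for single requests, so it is worth measuring rather than assuming.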