
Azure Data Factory Copy activity from Blob to Cosmos DB - need help skipping files over 2MB

I have an Azure Data Factory Copy activity within a pipeline that copies data from a Blob container (JSON files in multiple virtual folders) to Cosmos DB. However, there are fringe cases that cannot be avoided, where files larger than 2MB are placed in Blob storage. When the Copy activity picks them up, the transfer (and the subsequent pipeline activities) fails because I hit the 2MB hard limit for a Cosmos DB document. I have tried setting up a Lookup / Get Metadata activity, but I can't seem to address the relevant size property correctly or produce the output needed for the Delete activity.

Can anyone advise on an approach to handle this?

Thank you.

It should be possible to get the size of the files with the Get Metadata activity. Please note that the size is reported in bytes and the property can only be applied to a single file, not a folder.
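If you want to keep files over 2MB away from the Copy activity entirely, one option is a small pre-filter step that runs before the pipeline (for example, in an Azure Function or a scheduled script). The following is a minimal sketch using the azure-storage-blob v12 SDK; the container name, the `jsons/` prefix, and the `AZURE_STORAGE_CONNECTION_STRING` environment variable are assumptions for illustration, not details from the question.

```python
# Minimal sketch: flag (and optionally delete) blobs too large for a Cosmos DB document.
# Assumes the azure-storage-blob v12 SDK and a connection string in the environment.
import os
from azure.storage.blob import BlobServiceClient

MAX_DOC_BYTES = 2 * 1024 * 1024  # Cosmos DB's 2MB per-document limit, in bytes

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("source-container")  # placeholder name

# blob.size is in bytes, the same unit the Get Metadata activity reports.
for blob in container.list_blobs(name_starts_with="jsons/"):  # placeholder prefix
    if blob.size >= MAX_DOC_BYTES:
        print(f"Oversized blob: {blob.name} ({blob.size} bytes)")
        container.delete_blob(blob.name)  # or move it to a quarantine folder instead
```

Deleting is just one choice here; copying the oversized blobs to a quarantine container before removal would preserve them for later inspection.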


As far as I know, there is no way to avoid the 2MB limitation on a Cosmos DB document. You could refer to this question: What is the size limit of a single document stored in Azure Cosmos DB
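If the JSON documents are assembled or transformed before they reach Cosmos DB, a quick client-side check can catch oversized payloads early. This is a minimal, standard-library sketch; it only approximates the real limit, since Cosmos DB also counts the system properties it adds to each item.

```python
# Minimal sketch: check a JSON document's serialized size against the 2MB limit
# before attempting to write it to Cosmos DB.
import json

MAX_DOC_BYTES = 2 * 1024 * 1024  # Cosmos DB's per-document limit

def fits_in_cosmos(document: dict) -> bool:
    """Return True if the JSON-serialized document is safely under the 2MB limit."""
    size_bytes = len(json.dumps(document).encode("utf-8"))
    return size_bytes < MAX_DOC_BYTES

if __name__ == "__main__":
    doc = {"id": "example", "payload": "x" * 100}
    print(fits_in_cosmos(doc))  # True for this small document
```

Anything close to the 2MB threshold should be treated as over the limit, since the stored item will be slightly larger than the raw JSON.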
