
Upsert Cosmos item TTL using Azure Data Factory Copy Activity

I have a requirement to upsert data from a REST API into Cosmos DB while also maintaining an item-level TTL for a particular time interval.

I used an ADF Copy activity to copy the data, and for the TTL I added an additional custom column on the source side with the hardcoded value 30.
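For context, the goal is for each upserted item to carry Cosmos DB's reserved `ttl` property as an integer number of seconds. The intended document shape (field names other than `id` and `ttl` are illustrative) would be:

```json
{
  "id": "item-001",
  "someField": "value from the REST API",
  "ttl": 30
}
```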


I noticed that the time interval (seconds) is being written as a string instead of an integer, so the copy fails with the error below.

Details Failure happened on 'Sink' side. ErrorCode=UserErrorDocumentDBWriteError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Documents failed to import due to invalid documents which violate some of Cosmos DB constraints: 1) Document size shouldn't exceeds 2MB; 2) Document's 'id' property must be string if any, and must not include the following charaters: '/', '\', '?', '#'; 3) Document's 'ttl' property must not be non-digital type if any.,Source=Microsoft.DataTransfer.DocumentDbManagement,'

TTL mapping between the custom column and the Cosmos DB sink:


When I use ttl1 instead of ttl, the copy succeeds, but the value is stored as a string.
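The constraint can be reproduced outside of ADF: Cosmos DB only honours `ttl` when it is a JSON number, and ADF's additional columns always arrive as strings. A minimal sketch (the helper name and document are illustrative, not part of any ADF feature) of normalising a document before upserting it through your own code:

```python
import json

def normalize_ttl(doc: dict) -> dict:
    """Coerce the reserved 'ttl' property to an integer.

    ADF's additional columns produce e.g. {"ttl": "30"}; Cosmos DB
    rejects that with "'ttl' property must not be non-digital type".
    """
    if "ttl" in doc:
        doc["ttl"] = int(doc["ttl"])
    return doc

# A document as the Copy activity's additional column would emit it:
raw = {"id": "item-001", "ttl": "30"}
fixed = normalize_ttl(raw)
print(json.dumps(fixed))  # ttl is now the number 30, not the string "30"
```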


Any suggestions, please?

Yes, that is a known limitation of additional columns in the Copy activity. Even if you set the type to int, the value is converted to a string at the source.

A possible workaround is to create a Cosmos DB trigger in an Azure Function and set the ttl property there.
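A minimal sketch of that workaround's per-document logic, assuming the Copy activity upserts the documents *without* a ttl and an Azure Function on the container's change feed stamps it afterwards. Only the core logic is shown; the trigger/output-binding wiring that invokes it and writes the patched documents back is an assumption described in the comments, not real binding code:

```python
# Suggested workaround, sketched: let the ADF Copy activity upsert the
# documents without a ttl, then have an Azure Function with a Cosmos DB
# (change feed) trigger add the TTL to each new/updated document.
#
# In a real deployment this function would be called from the trigger,
# and the patched documents written back via an output binding or the
# Cosmos SDK (that wiring is assumed here, not shown).

TTL_SECONDS = 30  # desired item-level TTL, per the question

def stamp_ttl(documents: list[dict], ttl: int = TTL_SECONDS) -> list[dict]:
    """Set the reserved 'ttl' property as an integer on each document."""
    for doc in documents:
        doc["ttl"] = ttl  # an integer, so Cosmos DB will honour it
    return documents

# Documents as they might arrive from the change feed:
changed = [{"id": "item-001"}, {"id": "item-002"}]
print(stamp_ttl(changed))
```

Note that for TTL to take effect at all, "Time to Live" must also be enabled on the container itself (e.g. "On (no default)"), otherwise the per-item ttl value is ignored.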
