How to incrementally copy data from Azure SQL Database to Azure Blob Storage using Azure Data Factory, based on a condition: the timestamp has been updated
I need to copy some data from SQL Server to Azure Blob Storage. The data in SQL Server will be updated over time, and each row carries a timestamp; whenever I store data in Blob Storage, a timestamp is recorded there as well. Based on that condition, if the timestamp in SQL Server is greater than the timestamp in Blob Storage, I need to migrate that data. I tried migrating by moving the data from SQL Server to the storage account, but there may be further changes in SQL Server in the future, so how can I migrate the new data periodically?
I would suggest you go through some of the pipeline templates.
Perhaps...
If you do have a timestamp column in your source database to identify new or updated rows, but do not want to create an external control table to achieve delta copy, you can use the "Copy Data tool" to generate a pipeline that uses the trigger's scheduled time as a variable to read only the new rows from the source database.
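As a sketch of what that generated pipeline does (the table name `dbo.Orders` and column `LastModifiedTime` are assumptions, not from the original post): the pipeline exposes `windowStart`/`windowEnd` parameters, a tumbling window trigger fills them from `@trigger().outputs.windowStartTime` and `@trigger().outputs.windowEndTime`, and the copy activity's source query filters on that window:

```json
{
  "source": {
    "type": "AzureSqlSource",
    "sqlReaderQuery": {
      "type": "Expression",
      "value": "SELECT * FROM dbo.Orders WHERE LastModifiedTime > '@{pipeline().parameters.windowStart}' AND LastModifiedTime <= '@{pipeline().parameters.windowEnd}'"
    }
  }
}
```

Because each run reads only the half-open window (windowStart, windowEnd], consecutive scheduled runs never copy the same row twice, and no external watermark table is needed.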
See the Microsoft documentation for a detailed walkthrough.
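If you would rather script the same delta logic yourself (for example from an Azure Function) instead of using the Copy Data tool, the core of it is just a windowed query. A minimal sketch, assuming a hypothetical table `dbo.Orders` with a `LastModifiedTime` column:

```python
from datetime import datetime, timezone

def build_delta_query(table: str, ts_column: str,
                      window_start: datetime, window_end: datetime) -> str:
    """Build the incremental-copy query for one trigger window.

    Only rows whose timestamp falls inside (window_start, window_end]
    are selected, so consecutive runs never copy the same row twice.
    """
    fmt = "%Y-%m-%d %H:%M:%S"
    return (
        f"SELECT * FROM {table} "
        f"WHERE {ts_column} > '{window_start.strftime(fmt)}' "
        f"AND {ts_column} <= '{window_end.strftime(fmt)}'"
    )

# Example: a daily window ending at the trigger's scheduled time.
start = datetime(2023, 5, 1, tzinfo=timezone.utc)
end = datetime(2023, 5, 2, tzinfo=timezone.utc)
print(build_delta_query("dbo.Orders", "LastModifiedTime", start, end))
```

You would run the returned query against SQL Server, write the result set to a blob, and persist `window_end` as the next run's `window_start`.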