
Loading a new CSV in Azure Blob Storage to SQL DB

I am uploading a CSV file to an Azure Blob Storage account. I would like a process to be triggered when a new file is added, which takes the new CSV and BCP-loads it into an Azure SQL database.

My idea is to have an event-triggered Azure Data Factory pipeline. However, I am stuck on what to do next. Should an Azure Function be triggered that takes this CSV and uses BCP to load it into the DB? Can Azure Functions even use BCP?

I am using Python.

Please check the link below. Basically, you want to copy both new files and modified files, and a single Copy Data activity is enough for that. Use an event-based trigger (fired when a file is created) instead of a scheduled one.

https://www.mssqltips.com/sqlservertip/6365/incremental-file-load-using-azure-data-factory/
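If you prefer the Azure Function route instead of Data Factory: the `bcp` command-line tool is generally not available in a Python Function's sandbox, but you can get the same effect by parsing the CSV and doing a batched parameterized insert with `pyodbc` (`cursor.executemany` with `fast_executemany = True`). A minimal sketch of the CSV-to-insert step is below; the table name `dbo.People` and the idea that the first CSV row is a header are assumptions for illustration, not from the question.

```python
import csv
import io


def csv_to_insert(csv_text, table):
    """Turn CSV text (first row = column headers, an assumption) into a
    parameterized INSERT statement plus a list of row tuples, ready to
    pass to pyodbc's cursor.executemany."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)                       # column names from the header row
    rows = [tuple(r) for r in reader]           # one tuple per data row
    placeholders = ", ".join("?" for _ in header)
    columns = ", ".join(header)
    sql = f"INSERT INTO {table} ({columns}) VALUES ({placeholders})"
    return sql, rows


# In a blob-triggered Azure Function you would read the blob's bytes,
# decode them, then (sketch only, connection string is hypothetical):
#
#   import pyodbc
#   sql, rows = csv_to_insert(blob_bytes.decode("utf-8"), "dbo.People")
#   with pyodbc.connect(conn_str) as conn:
#       cur = conn.cursor()
#       cur.fast_executemany = True   # batches inserts, BCP-like speed
#       cur.executemany(sql, rows)
```

Note that all values arrive as strings; for typed columns you would either let SQL Server convert them or cast in Python before inserting.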

